Oct 13 18:14:23 crc systemd[1]: Starting Kubernetes Kubelet...
Oct 13 18:14:24 crc restorecon[4670]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Oct 13 18:14:24 crc restorecon[4670]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc 
restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc 
restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 
18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc 
restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc 
restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24
crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Oct 13 18:14:24 crc restorecon[4670]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 
13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 
crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc 
restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc 
restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc 
restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc 
restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 18:14:24 crc restorecon[4670]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Oct 13 18:14:24 crc restorecon[4670]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Oct 13 18:14:25 crc kubenswrapper[4974]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 18:14:25 crc kubenswrapper[4974]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Oct 13 18:14:25 crc kubenswrapper[4974]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 18:14:25 crc kubenswrapper[4974]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Oct 13 18:14:25 crc kubenswrapper[4974]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Oct 13 18:14:25 crc kubenswrapper[4974]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.534789 4974 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537684 4974 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537701 4974 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537706 4974 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537712 4974 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537737 4974 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537743 4974 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537748 4974 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537753 4974 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537757 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537762 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537767 4974 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537772 4974 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537776 4974 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537780 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537785 4974 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537790 4974 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537794 4974 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537798 4974 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537802 4974 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537805 4974 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537809 4974 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537813 4974 feature_gate.go:330] 
unrecognized feature gate: MetricsCollectionProfiles Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537816 4974 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537820 4974 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537823 4974 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537834 4974 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537838 4974 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537842 4974 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537846 4974 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537850 4974 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537853 4974 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537857 4974 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537861 4974 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537864 4974 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537868 4974 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537871 4974 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities 
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537875 4974 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537879 4974 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537882 4974 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537886 4974 feature_gate.go:330] unrecognized feature gate: Example
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537890 4974 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537893 4974 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537898 4974 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537901 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537905 4974 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537910 4974 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537915 4974 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537919 4974 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537924 4974 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537928 4974 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537933 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537938 4974 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537942 4974 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537946 4974 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537949 4974 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537954 4974 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537959 4974 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537963 4974 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537966 4974 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537970 4974 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537973 4974 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537977 4974 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537981 4974 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537984 4974 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537988 4974 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537991 4974 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537994 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.537998 4974 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.538001 4974 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.538005 4974 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.538008 4974 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540219 4974 flags.go:64] FLAG: --address="0.0.0.0"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540236 4974 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540244 4974 flags.go:64] FLAG: --anonymous-auth="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540251 4974 flags.go:64] FLAG: --application-metrics-count-limit="100"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540257 4974 flags.go:64] FLAG: --authentication-token-webhook="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540261 4974 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540267 4974 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540273 4974 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540277 4974 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540282 4974 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540287 4974 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540291 4974 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540295 4974 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540299 4974 flags.go:64] FLAG: --cgroup-root=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540303 4974 flags.go:64] FLAG: --cgroups-per-qos="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540309 4974 flags.go:64] FLAG: --client-ca-file=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540314 4974 flags.go:64] FLAG: --cloud-config=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540318 4974 flags.go:64] FLAG: --cloud-provider=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540322 4974 flags.go:64] FLAG: --cluster-dns="[]"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540327 4974 flags.go:64] FLAG: --cluster-domain=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540331 4974 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540335 4974 flags.go:64] FLAG: --config-dir=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540339 4974 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540344 4974 flags.go:64] FLAG: --container-log-max-files="5"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540349 4974 flags.go:64] FLAG: --container-log-max-size="10Mi"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540353 4974 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540358 4974 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540362 4974 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540366 4974 flags.go:64] FLAG: --contention-profiling="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540370 4974 flags.go:64] FLAG: --cpu-cfs-quota="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540374 4974 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540378 4974 flags.go:64] FLAG: --cpu-manager-policy="none"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540382 4974 flags.go:64] FLAG: --cpu-manager-policy-options=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540387 4974 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540391 4974 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540395 4974 flags.go:64] FLAG: --enable-debugging-handlers="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540400 4974 flags.go:64] FLAG: --enable-load-reader="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540404 4974 flags.go:64] FLAG: --enable-server="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540408 4974 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540414 4974 flags.go:64] FLAG: --event-burst="100"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540418 4974 flags.go:64] FLAG: --event-qps="50"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540422 4974 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540426 4974 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540430 4974 flags.go:64] FLAG: --eviction-hard=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540436 4974 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540440 4974 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540444 4974 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540449 4974 flags.go:64] FLAG: --eviction-soft=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540453 4974 flags.go:64] FLAG: --eviction-soft-grace-period=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540458 4974 flags.go:64] FLAG: --exit-on-lock-contention="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540462 4974 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540466 4974 flags.go:64] FLAG: --experimental-mounter-path=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540470 4974 flags.go:64] FLAG: --fail-cgroupv1="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540474 4974 flags.go:64] FLAG: --fail-swap-on="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540479 4974 flags.go:64] FLAG: --feature-gates=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540483 4974 flags.go:64] FLAG: --file-check-frequency="20s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540488 4974 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540492 4974 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540496 4974 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540500 4974 flags.go:64] FLAG: --healthz-port="10248"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540504 4974 flags.go:64] FLAG: --help="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540508 4974 flags.go:64] FLAG: --hostname-override=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540512 4974 flags.go:64] FLAG: --housekeeping-interval="10s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540516 4974 flags.go:64] FLAG: --http-check-frequency="20s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540520 4974 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540524 4974 flags.go:64] FLAG: --image-credential-provider-config=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540528 4974 flags.go:64] FLAG: --image-gc-high-threshold="85"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540532 4974 flags.go:64] FLAG: --image-gc-low-threshold="80"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540536 4974 flags.go:64] FLAG: --image-service-endpoint=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540540 4974 flags.go:64] FLAG: --kernel-memcg-notification="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540544 4974 flags.go:64] FLAG: --kube-api-burst="100"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540548 4974 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540552 4974 flags.go:64] FLAG: --kube-api-qps="50"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540556 4974 flags.go:64] FLAG: --kube-reserved=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540560 4974 flags.go:64] FLAG: --kube-reserved-cgroup=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540564 4974 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540569 4974 flags.go:64] FLAG: --kubelet-cgroups=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540573 4974 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540577 4974 flags.go:64] FLAG: --lock-file=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540581 4974 flags.go:64] FLAG: --log-cadvisor-usage="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540586 4974 flags.go:64] FLAG: --log-flush-frequency="5s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540591 4974 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540597 4974 flags.go:64] FLAG: --log-json-split-stream="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540601 4974 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540605 4974 flags.go:64] FLAG: --log-text-split-stream="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540610 4974 flags.go:64] FLAG: --logging-format="text"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540614 4974 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540618 4974 flags.go:64] FLAG: --make-iptables-util-chains="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540623 4974 flags.go:64] FLAG: --manifest-url=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540627 4974 flags.go:64] FLAG: --manifest-url-header=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540633 4974 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540638 4974 flags.go:64] FLAG: --max-open-files="1000000"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540644 4974 flags.go:64] FLAG: --max-pods="110"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540648 4974 flags.go:64] FLAG: --maximum-dead-containers="-1"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540666 4974 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540671 4974 flags.go:64] FLAG: --memory-manager-policy="None"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540675 4974 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540679 4974 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540683 4974 flags.go:64] FLAG: --node-ip="192.168.126.11"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540688 4974 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540698 4974 flags.go:64] FLAG: --node-status-max-images="50"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540703 4974 flags.go:64] FLAG: --node-status-update-frequency="10s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540707 4974 flags.go:64] FLAG: --oom-score-adj="-999"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540711 4974 flags.go:64] FLAG: --pod-cidr=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540715 4974 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540722 4974 flags.go:64] FLAG: --pod-manifest-path=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540726 4974 flags.go:64] FLAG: --pod-max-pids="-1"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540730 4974 flags.go:64] FLAG: --pods-per-core="0"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540734 4974 flags.go:64] FLAG: --port="10250"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540738 4974 flags.go:64] FLAG: --protect-kernel-defaults="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540742 4974 flags.go:64] FLAG: --provider-id=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540747 4974 flags.go:64] FLAG: --qos-reserved=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540751 4974 flags.go:64] FLAG: --read-only-port="10255"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540756 4974 flags.go:64] FLAG: --register-node="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540760 4974 flags.go:64] FLAG: --register-schedulable="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540763 4974 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540771 4974 flags.go:64] FLAG: --registry-burst="10"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540775 4974 flags.go:64] FLAG: --registry-qps="5"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540779 4974 flags.go:64] FLAG: --reserved-cpus=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540783 4974 flags.go:64] FLAG: --reserved-memory=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540789 4974 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540793 4974 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540797 4974 flags.go:64] FLAG: --rotate-certificates="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540801 4974 flags.go:64] FLAG: --rotate-server-certificates="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540805 4974 flags.go:64] FLAG: --runonce="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540809 4974 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540813 4974 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540817 4974 flags.go:64] FLAG: --seccomp-default="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540821 4974 flags.go:64] FLAG: --serialize-image-pulls="true"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540825 4974 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540830 4974 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540834 4974 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540838 4974 flags.go:64] FLAG: --storage-driver-password="root"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540842 4974 flags.go:64] FLAG: --storage-driver-secure="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540846 4974 flags.go:64] FLAG: --storage-driver-table="stats"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540850 4974 flags.go:64] FLAG: --storage-driver-user="root"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540854 4974 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540858 4974 flags.go:64] FLAG: --sync-frequency="1m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540862 4974 flags.go:64] FLAG: --system-cgroups=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540866 4974 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540872 4974 flags.go:64] FLAG: --system-reserved-cgroup=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540876 4974 flags.go:64] FLAG: --tls-cert-file=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540883 4974 flags.go:64] FLAG: --tls-cipher-suites="[]"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540888 4974 flags.go:64] FLAG: --tls-min-version=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540892 4974 flags.go:64] FLAG: --tls-private-key-file=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540897 4974 flags.go:64] FLAG: --topology-manager-policy="none"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540900 4974 flags.go:64] FLAG: --topology-manager-policy-options=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540905 4974 flags.go:64] FLAG: --topology-manager-scope="container"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540909 4974 flags.go:64] FLAG: --v="2"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540915 4974 flags.go:64] FLAG: --version="false"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540920 4974 flags.go:64] FLAG: --vmodule=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540926 4974 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.540930 4974 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541036 4974 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541041 4974 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541045 4974 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541049 4974 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541053 4974 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541057 4974 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541060 4974 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541064 4974 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541067 4974 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541070 4974 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541075 4974 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541078 4974 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541083 4974 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541088 4974 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541093 4974 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541097 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541101 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541105 4974 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541110 4974 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541114 4974 feature_gate.go:330] unrecognized feature gate: OVNObservability
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541118 4974 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541125 4974 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541130 4974 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541134 4974 feature_gate.go:330] unrecognized feature gate: PinnedImages
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541138 4974 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541143 4974 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541147 4974 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541152 4974 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541156 4974 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541160 4974 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541165 4974 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541169 4974 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541172 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541175 4974 feature_gate.go:330] unrecognized feature gate: NewOLM
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541179 4974 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541182 4974 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541187 4974 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541190 4974 feature_gate.go:330] unrecognized feature gate: SignatureStores
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541194 4974 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541198 4974 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541201 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541205 4974 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541209 4974 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541214 4974 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541218 4974 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541222 4974 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541226 4974 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541232 4974 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541237 4974 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541241 4974 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541246 4974 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541249 4974 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541254 4974 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541261 4974 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541265 4974 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541269 4974 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541278 4974 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541283 4974 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541287 4974 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541291 4974 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541295 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541300 4974 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541307 4974 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541311 4974 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541316 4974 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541320 4974 feature_gate.go:330] unrecognized feature gate: Example
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541325 4974 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541328 4974 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541332 4974 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541336 4974 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.541340 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.541352 4974 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.555396 4974 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.555461 4974 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555604 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555628 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555637 4974 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555681 4974 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555693 4974 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555702 4974 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555710 4974 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555718 4974 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555727 4974 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555736 4974 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555745 4974 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555753 4974 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555762 4974 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555770 4974 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555779 4974 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555788 4974 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555796 4974 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555804 4974 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555813 4974 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555825 4974 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555838 4974 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555853 4974 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555863 4974 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555874 4974 feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555884 4974 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555894 4974 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555903 4974 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555912 4974 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555920 4974 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555929 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555937 4974 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555946 4974 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555954 4974 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555963 4974 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555973 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555982 4974 feature_gate.go:330] unrecognized feature 
gate: AlibabaPlatform Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555990 4974 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.555999 4974 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556008 4974 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556017 4974 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556025 4974 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556034 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556043 4974 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556051 4974 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556062 4974 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556073 4974 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556082 4974 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556091 4974 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556102 4974 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556113 4974 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556122 4974 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556132 4974 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556141 4974 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556150 4974 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556159 4974 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556172 4974 feature_gate.go:330] unrecognized feature gate: Example Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556181 4974 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556189 4974 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556198 4974 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556206 4974 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556215 4974 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556223 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556231 4974 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556240 4974 
feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556248 4974 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556256 4974 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556264 4974 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556273 4974 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556281 4974 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556289 4974 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556298 4974 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.556317 4974 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556585 4974 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556604 4974 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556617 4974 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556628 4974 feature_gate.go:330] unrecognized feature gate: NewOLM Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556636 4974 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556645 4974 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556682 4974 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556691 4974 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556699 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556709 4974 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556718 4974 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556726 4974 feature_gate.go:330] unrecognized feature gate: OVNObservability Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556735 4974 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556743 4974 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556751 4974 feature_gate.go:330] unrecognized feature gate: SignatureStores Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556761 4974 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556770 4974 
feature_gate.go:330] unrecognized feature gate: GatewayAPI Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556779 4974 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556788 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556797 4974 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556805 4974 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556815 4974 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556826 4974 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556835 4974 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556845 4974 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556854 4974 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556862 4974 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556870 4974 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556878 4974 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556887 4974 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556895 4974 feature_gate.go:330] unrecognized feature gate: 
MachineAPIProviderOpenStack Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556903 4974 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556912 4974 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556920 4974 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556929 4974 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556939 4974 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556947 4974 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556955 4974 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556963 4974 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556972 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556980 4974 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556989 4974 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.556997 4974 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557005 4974 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557013 4974 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Oct 13 18:14:25 crc 
kubenswrapper[4974]: W1013 18:14:25.557022 4974 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557033 4974 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557044 4974 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557053 4974 feature_gate.go:330] unrecognized feature gate: PinnedImages Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557062 4974 feature_gate.go:330] unrecognized feature gate: PlatformOperators Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557071 4974 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557082 4974 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557092 4974 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557101 4974 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557109 4974 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557117 4974 feature_gate.go:330] unrecognized feature gate: Example Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557126 4974 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557134 4974 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557142 4974 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557150 4974 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557159 4974 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557167 4974 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557176 4974 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557185 4974 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557193 4974 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557201 4974 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557210 
4974 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557218 4974 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557227 4974 feature_gate.go:330] unrecognized feature gate: InsightsConfig Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557235 4974 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.557244 4974 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.557258 4974 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.557550 4974 server.go:940] "Client rotation is on, will bootstrap in background" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.566043 4974 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.566186 4974 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.568194 4974 server.go:997] "Starting client certificate rotation" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.568242 4974 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.569113 4974 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-12 06:17:58.908274135 +0000 UTC Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.569199 4974 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 708h3m33.339077103s for next certificate rotation Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.601475 4974 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.606265 4974 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.627616 4974 log.go:25] "Validated CRI v1 runtime API" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.670303 4974 log.go:25] "Validated CRI v1 image API" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.672425 4974 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.678699 4974 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-10-13-17-48-14-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.678743 4974 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.705991 4974 manager.go:217] Machine: {Timestamp:2025-10-13 18:14:25.701219986 +0000 UTC m=+0.605586146 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7fb80a93-bf09-453c-9c6a-784a87b26241 BootID:71c32b41-77fa-4116-87b2-213f1ff9d252 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:2e:95:10 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:2e:95:10 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:42:9c:be Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:fe:5e:c4 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:64:dc:48 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:b2:ef:28 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:7e:b2:2e:78:66:c6 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:6e:4a:98:ac:85:14 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.706401 4974 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.706692 4974 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.709197 4974 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.709523 4974 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.709579 4974 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.709925 4974 topology_manager.go:138] "Creating topology manager with none policy"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.709943 4974 container_manager_linux.go:303] "Creating device plugin manager"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.710876 4974 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.710924 4974 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.711183 4974 state_mem.go:36] "Initialized new in-memory state store"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.711329 4974 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.715441 4974 kubelet.go:418] "Attempting to sync node with API server"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.715486 4974 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.715514 4974 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.715541 4974 kubelet.go:324] "Adding apiserver pod source"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.715560 4974 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.720609 4974 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.726015 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.726258 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.726317 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.726428 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.726411 4974 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.729967 4974 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733582 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733635 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733684 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733704 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733734 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733758 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733776 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733804 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733832 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733850 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733881 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.733899 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.736027 4974 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.737004 4974 server.go:1280] "Started kubelet"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.737887 4974 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.737904 4974 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.739025 4974 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Oct 13 18:14:25 crc systemd[1]: Started Kubernetes Kubelet.
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.739602 4974 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.740306 4974 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.740362 4974 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.740649 4974 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 05:45:35.308250834 +0000 UTC
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.740848 4974 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1259h31m9.567409393s for next certificate rotation
Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.740736 4974 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.740759 4974 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.740729 4974 volume_manager.go:287] "The desired_state_of_world populator starts"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.741411 4974 volume_manager.go:289] "Starting Kubelet Volume Manager"
Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.741703 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused
Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.741846 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.741470 4974 server.go:460] "Adding debug handlers to kubelet server"
Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.742687 4974 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="200ms"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.743250 4974 factory.go:55] Registering systemd factory
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.743305 4974 factory.go:221] Registration of the systemd container factory successfully
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.743822 4974 factory.go:153] Registering CRI-O factory
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.743865 4974 factory.go:221] Registration of the crio container factory successfully
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.743981 4974 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.744039 4974 factory.go:103] Registering Raw factory
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.744069 4974 manager.go:1196] Started watching for new ooms in manager
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.745857 4974 manager.go:319] Starting recovery of all containers
Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.746168 4974 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.30:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.186e1f9d8436bc11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-13 18:14:25.736948753 +0000 UTC m=+0.641314863,LastTimestamp:2025-10-13 18:14:25.736948753 +0000 UTC m=+0.641314863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761440 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761530 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761554 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761575 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761595 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761614 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761632 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761754 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761785 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761814 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761833 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761853 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761872 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761896 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761915 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761934 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761955 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761973 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.761990 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762046 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762066 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762086 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762106 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762125 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762144 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762162 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762184 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762205 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762223 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762239 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762258 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762275 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762295 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762313 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762365 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762387 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762408 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762428 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762447 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762466 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762524 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762546 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762568 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762586 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762605 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762699 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762720 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762741 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762760 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762779 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762826 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762845 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762927 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762959 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.762979 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763007 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763037 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763087 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763154 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763180 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763200 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763218 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763236 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763255 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763274 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763293 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763352 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763371 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763388 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763405 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763492 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763511 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763527 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763545 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763591 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763609 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763628 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763645 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763695 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763713 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763732 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763750 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763801 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763819 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763837 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763854 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763873 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j"
seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763891 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763910 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763929 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763974 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.763992 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764010 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: 
I1013 18:14:25.764027 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764045 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764066 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764084 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764102 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764146 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764173 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764191 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764209 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764228 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764247 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764357 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764380 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764443 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764462 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764536 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764556 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764619 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764640 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" 
seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764687 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764708 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764759 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764777 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764795 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764815 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 
18:14:25.764832 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764850 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764867 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764885 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764979 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.764997 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765014 4974 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765031 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765050 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765067 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765085 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765103 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765147 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765165 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765183 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765203 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765221 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765238 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765255 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" 
seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765272 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765380 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765407 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765432 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765455 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765477 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 
18:14:25.765501 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765527 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765546 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765709 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765731 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765747 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765764 4974 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765782 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765799 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765817 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765835 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765898 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765917 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765935 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765952 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765971 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.765990 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766007 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766025 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766046 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766063 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766079 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766100 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766130 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766154 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766175 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766192 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.766211 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.769885 4974 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.769964 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.769992 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770015 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770039 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770071 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770093 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770114 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770132 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770152 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770172 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770191 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770216 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770241 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770272 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770298 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770325 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770355 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770385 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770413 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770442 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770471 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770500 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770581 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770614 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770643 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770705 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" 
seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770736 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770764 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770790 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770817 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770848 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770874 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770900 4974 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770928 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770958 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.770986 4974 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.771011 4974 reconstruct.go:97] "Volume reconstruction finished" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.771028 4974 reconciler.go:26] "Reconciler: start to sync state" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.785198 4974 manager.go:324] Recovery completed Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.797989 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.801056 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.801158 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.801179 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.802541 4974 cpu_manager.go:225] "Starting CPU manager" policy="none" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.802570 4974 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.802602 4974 state_mem.go:36] "Initialized new in-memory state store" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.807871 4974 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.810249 4974 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.810312 4974 status_manager.go:217] "Starting to sync pod status with apiserver" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.810355 4974 kubelet.go:2335] "Starting kubelet main sync loop" Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.810427 4974 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Oct 13 18:14:25 crc kubenswrapper[4974]: W1013 18:14:25.814563 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.814627 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial 
tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.816040 4974 policy_none.go:49] "None policy: Start" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.816873 4974 memory_manager.go:170] "Starting memorymanager" policy="None" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.816917 4974 state_mem.go:35] "Initializing new in-memory state store" Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.842025 4974 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.870000 4974 manager.go:334] "Starting Device Plugin manager" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.870089 4974 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.870106 4974 server.go:79] "Starting device plugin registration server" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.870676 4974 eviction_manager.go:189] "Eviction manager: starting control loop" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.870695 4974 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.871282 4974 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.871439 4974 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.871467 4974 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.878685 4974 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 13 18:14:25 
crc kubenswrapper[4974]: I1013 18:14:25.910890 4974 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.911119 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.912569 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.912616 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.912632 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.912827 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.913257 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.913313 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.913790 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.913832 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.913847 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.914013 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.915354 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.915463 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.916720 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.916761 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.916779 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.917234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.917251 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.917295 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.917328 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.917363 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.917385 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.918404 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.918867 
4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.918963 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.920556 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.920573 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.920605 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.920623 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.920637 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.920682 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.920822 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.920961 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.921002 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.922248 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.922246 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.922304 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.922320 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.922283 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.922384 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.922470 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.922494 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.923277 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.923315 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.923332 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.944297 4974 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="400ms" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.971397 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.972561 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.972640 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.972694 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.972733 4974 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 18:14:25 crc kubenswrapper[4974]: E1013 18:14:25.973370 4974 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977017 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977079 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977118 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977188 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977233 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977372 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977420 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977446 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977472 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977496 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977518 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977538 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977579 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977631 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 18:14:25 crc kubenswrapper[4974]: I1013 18:14:25.977689 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc 
kubenswrapper[4974]: I1013 18:14:26.078990 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079079 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079135 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079198 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079245 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079264 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" 
(UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079338 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079265 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079352 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079391 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079379 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079415 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079379 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079439 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079466 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079514 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079558 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079527 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079708 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079743 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079790 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079809 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079878 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079900 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079912 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079952 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079820 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079973 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.079990 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.080057 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.174259 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.175375 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.175426 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.175438 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.175463 4974 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 18:14:26 crc kubenswrapper[4974]: E1013 18:14:26.175957 4974 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.269847 
4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.293561 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.307573 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: W1013 18:14:26.316501 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-ea7a6ff35d479cf81335e49d68ce68fdd2e6f31742a5a6bd67aca686b5b96f39 WatchSource:0}: Error finding container ea7a6ff35d479cf81335e49d68ce68fdd2e6f31742a5a6bd67aca686b5b96f39: Status 404 returned error can't find the container with id ea7a6ff35d479cf81335e49d68ce68fdd2e6f31742a5a6bd67aca686b5b96f39 Oct 13 18:14:26 crc kubenswrapper[4974]: W1013 18:14:26.332010 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-adcada918b753c6f425970da63822a10c68d1ea9b6cfd8959eb407fc36ed9b6e WatchSource:0}: Error finding container adcada918b753c6f425970da63822a10c68d1ea9b6cfd8959eb407fc36ed9b6e: Status 404 returned error can't find the container with id adcada918b753c6f425970da63822a10c68d1ea9b6cfd8959eb407fc36ed9b6e Oct 13 18:14:26 crc kubenswrapper[4974]: W1013 18:14:26.333636 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-f4292cf72e087a29c6e9ccc1755b9dfa7c556996f6f49f028ccd81114aea25b1 WatchSource:0}: Error finding container f4292cf72e087a29c6e9ccc1755b9dfa7c556996f6f49f028ccd81114aea25b1: Status 404 returned 
error can't find the container with id f4292cf72e087a29c6e9ccc1755b9dfa7c556996f6f49f028ccd81114aea25b1 Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.334782 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.342884 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 18:14:26 crc kubenswrapper[4974]: E1013 18:14:26.345142 4974 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="800ms" Oct 13 18:14:26 crc kubenswrapper[4974]: W1013 18:14:26.356241 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-bb4887ee1204e6363a37007732af29618508e5d1e467c9fe1a0e7117a4195507 WatchSource:0}: Error finding container bb4887ee1204e6363a37007732af29618508e5d1e467c9fe1a0e7117a4195507: Status 404 returned error can't find the container with id bb4887ee1204e6363a37007732af29618508e5d1e467c9fe1a0e7117a4195507 Oct 13 18:14:26 crc kubenswrapper[4974]: W1013 18:14:26.365732 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-b8bb0e07393d30283d08123045c1100ba4c6abdec4807542ec4d325d2c3efe64 WatchSource:0}: Error finding container b8bb0e07393d30283d08123045c1100ba4c6abdec4807542ec4d325d2c3efe64: Status 404 returned error can't find the container with id b8bb0e07393d30283d08123045c1100ba4c6abdec4807542ec4d325d2c3efe64 Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.576680 4974 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.578568 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.578629 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.578729 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.578768 4974 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 18:14:26 crc kubenswrapper[4974]: E1013 18:14:26.579350 4974 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Oct 13 18:14:26 crc kubenswrapper[4974]: W1013 18:14:26.662392 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Oct 13 18:14:26 crc kubenswrapper[4974]: E1013 18:14:26.662513 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.740613 4974 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 
38.102.83.30:6443: connect: connection refused Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.818219 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bb4887ee1204e6363a37007732af29618508e5d1e467c9fe1a0e7117a4195507"} Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.819540 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f4292cf72e087a29c6e9ccc1755b9dfa7c556996f6f49f028ccd81114aea25b1"} Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.821470 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"adcada918b753c6f425970da63822a10c68d1ea9b6cfd8959eb407fc36ed9b6e"} Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.823181 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ea7a6ff35d479cf81335e49d68ce68fdd2e6f31742a5a6bd67aca686b5b96f39"} Oct 13 18:14:26 crc kubenswrapper[4974]: I1013 18:14:26.824164 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b8bb0e07393d30283d08123045c1100ba4c6abdec4807542ec4d325d2c3efe64"} Oct 13 18:14:26 crc kubenswrapper[4974]: W1013 18:14:26.933913 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Oct 13 18:14:26 crc 
kubenswrapper[4974]: E1013 18:14:26.934003 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Oct 13 18:14:27 crc kubenswrapper[4974]: W1013 18:14:27.015875 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Oct 13 18:14:27 crc kubenswrapper[4974]: E1013 18:14:27.015987 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Oct 13 18:14:27 crc kubenswrapper[4974]: E1013 18:14:27.146030 4974 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="1.6s" Oct 13 18:14:27 crc kubenswrapper[4974]: W1013 18:14:27.270647 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Oct 13 18:14:27 crc kubenswrapper[4974]: E1013 18:14:27.270794 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.380326 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.381811 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.381860 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.381870 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.381894 4974 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 18:14:27 crc kubenswrapper[4974]: E1013 18:14:27.382334 4974 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.740955 4974 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.830005 4974 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16" exitCode=0 Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.830106 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16"} Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.830177 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.831957 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.832014 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.832038 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.835726 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4"} Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.835775 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d"} Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.835787 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.835801 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e"} Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.835827 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93"} Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.837239 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.837416 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.837443 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.840468 4974 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c" exitCode=0 Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.840715 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.840770 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c"} Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.843443 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.843514 4974 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.843550 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.845593 4974 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e" exitCode=0 Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.845718 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e"} Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.846017 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.847543 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.847614 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.847642 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.849509 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.849771 4974 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="66c168a412844860c141c82e64731411223f40ddf9ea57e2bf812d02216b44e9" exitCode=0 Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.849824 4974 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"66c168a412844860c141c82e64731411223f40ddf9ea57e2bf812d02216b44e9"} Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.849923 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.851227 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.851286 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.851293 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.851360 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.851322 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:27 crc kubenswrapper[4974]: I1013 18:14:27.851460 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.225218 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:28 crc kubenswrapper[4974]: W1013 18:14:28.738711 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Oct 13 18:14:28 crc kubenswrapper[4974]: E1013 
18:14:28.738826 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.741229 4974 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Oct 13 18:14:28 crc kubenswrapper[4974]: E1013 18:14:28.746907 4974 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.30:6443: connect: connection refused" interval="3.2s" Oct 13 18:14:28 crc kubenswrapper[4974]: W1013 18:14:28.833915 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.30:6443: connect: connection refused Oct 13 18:14:28 crc kubenswrapper[4974]: E1013 18:14:28.834033 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.30:6443: connect: connection refused" logger="UnhandledError" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.858354 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a"} Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.858402 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792"} Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.858413 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8"} Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.858422 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.859639 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.859697 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.859740 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.863595 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64"} Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.863798 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442"} Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.863833 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9"} Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.863854 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf"} Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.867124 4974 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319" exitCode=0 Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.867222 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319"} Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.867307 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.868693 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.868732 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.868747 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.870699 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bff0fea41490697624b947252e1babd980715a0a78dac154faa90511067e736b"} Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.870763 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.870790 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.874494 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.874580 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.875211 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.875320 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.875361 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.875379 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.982778 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.984806 4974 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.984846 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.984858 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:28 crc kubenswrapper[4974]: I1013 18:14:28.984886 4974 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 18:14:28 crc kubenswrapper[4974]: E1013 18:14:28.985426 4974 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.30:6443: connect: connection refused" node="crc" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.877245 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c"} Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.877410 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.879205 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.879272 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.879291 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.879947 4974 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" 
containerID="d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a" exitCode=0 Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.880106 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.880010 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a"} Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.880174 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.880242 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.880739 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.880808 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.881384 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.881437 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.881449 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.881595 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.881682 4974 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.881711 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.881929 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.881991 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.882006 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.882645 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.882742 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:29 crc kubenswrapper[4974]: I1013 18:14:29.882764 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:30 crc kubenswrapper[4974]: I1013 18:14:30.890188 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c"} Oct 13 18:14:30 crc kubenswrapper[4974]: I1013 18:14:30.890253 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:30 crc kubenswrapper[4974]: I1013 18:14:30.890291 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9"} Oct 13 18:14:30 crc kubenswrapper[4974]: I1013 18:14:30.890317 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc"} Oct 13 18:14:30 crc kubenswrapper[4974]: I1013 18:14:30.890393 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:30 crc kubenswrapper[4974]: I1013 18:14:30.891884 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:30 crc kubenswrapper[4974]: I1013 18:14:30.891943 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:30 crc kubenswrapper[4974]: I1013 18:14:30.891971 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.225995 4974 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.226092 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 
18:14:31.237418 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.237963 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.239914 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.239954 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.239965 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.698485 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.698786 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.700199 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.700254 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.700278 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.898442 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.898421 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95"} Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.898602 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323"} Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.898704 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.899739 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.899805 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.899829 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.900607 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.900688 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:31 crc kubenswrapper[4974]: I1013 18:14:31.900729 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:32 crc kubenswrapper[4974]: I1013 18:14:32.186572 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:32 crc kubenswrapper[4974]: I1013 18:14:32.188055 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Oct 13 18:14:32 crc kubenswrapper[4974]: I1013 18:14:32.188113 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:32 crc kubenswrapper[4974]: I1013 18:14:32.188140 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:32 crc kubenswrapper[4974]: I1013 18:14:32.188179 4974 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 18:14:32 crc kubenswrapper[4974]: I1013 18:14:32.900639 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:32 crc kubenswrapper[4974]: I1013 18:14:32.901988 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:32 crc kubenswrapper[4974]: I1013 18:14:32.902054 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:32 crc kubenswrapper[4974]: I1013 18:14:32.902080 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:33 crc kubenswrapper[4974]: I1013 18:14:33.094458 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:33 crc kubenswrapper[4974]: I1013 18:14:33.094749 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:33 crc kubenswrapper[4974]: I1013 18:14:33.096521 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:33 crc kubenswrapper[4974]: I1013 18:14:33.096608 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:33 crc kubenswrapper[4974]: I1013 18:14:33.096637 4974 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:33 crc kubenswrapper[4974]: I1013 18:14:33.564064 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:33 crc kubenswrapper[4974]: I1013 18:14:33.903825 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:33 crc kubenswrapper[4974]: I1013 18:14:33.905226 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:33 crc kubenswrapper[4974]: I1013 18:14:33.905302 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:33 crc kubenswrapper[4974]: I1013 18:14:33.905337 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.172704 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.172981 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.174488 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.174518 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.174529 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.456823 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:35 crc 
kubenswrapper[4974]: I1013 18:14:35.457095 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.458755 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.458814 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.458840 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.657315 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Oct 13 18:14:35 crc kubenswrapper[4974]: E1013 18:14:35.878808 4974 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.909082 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.910496 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.910549 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:35 crc kubenswrapper[4974]: I1013 18:14:35.910562 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.180902 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.181094 4974 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.182983 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.183055 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.183073 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.191939 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.915054 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.916849 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.916914 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.916934 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:37 crc kubenswrapper[4974]: I1013 18:14:37.922411 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:38 crc kubenswrapper[4974]: I1013 18:14:38.918995 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:38 crc kubenswrapper[4974]: I1013 18:14:38.920757 4974 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:38 crc kubenswrapper[4974]: I1013 18:14:38.920838 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:38 crc kubenswrapper[4974]: I1013 18:14:38.920861 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:39 crc kubenswrapper[4974]: W1013 18:14:39.608514 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 13 18:14:39 crc kubenswrapper[4974]: I1013 18:14:39.608668 4974 trace.go:236] Trace[763622107]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Oct-2025 18:14:29.606) (total time: 10001ms): Oct 13 18:14:39 crc kubenswrapper[4974]: Trace[763622107]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:14:39.608) Oct 13 18:14:39 crc kubenswrapper[4974]: Trace[763622107]: [10.001694641s] [10.001694641s] END Oct 13 18:14:39 crc kubenswrapper[4974]: E1013 18:14:39.608732 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 13 18:14:39 crc kubenswrapper[4974]: I1013 18:14:39.741989 4974 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Oct 13 18:14:40 crc kubenswrapper[4974]: E1013 18:14:40.268899 4974 event.go:368] "Unable to write event 
(may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": net/http: TLS handshake timeout" event="&Event{ObjectMeta:{crc.186e1f9d8436bc11 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-10-13 18:14:25.736948753 +0000 UTC m=+0.641314863,LastTimestamp:2025-10-13 18:14:25.736948753 +0000 UTC m=+0.641314863,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Oct 13 18:14:40 crc kubenswrapper[4974]: W1013 18:14:40.421798 4974 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Oct 13 18:14:40 crc kubenswrapper[4974]: I1013 18:14:40.421882 4974 trace.go:236] Trace[512005150]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Oct-2025 18:14:30.420) (total time: 10001ms): Oct 13 18:14:40 crc kubenswrapper[4974]: Trace[512005150]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:14:40.421) Oct 13 18:14:40 crc kubenswrapper[4974]: Trace[512005150]: [10.001804464s] [10.001804464s] END Oct 13 18:14:40 crc kubenswrapper[4974]: E1013 18:14:40.421901 4974 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Oct 13 18:14:40 crc 
kubenswrapper[4974]: I1013 18:14:40.617020 4974 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 13 18:14:40 crc kubenswrapper[4974]: I1013 18:14:40.617098 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 13 18:14:40 crc kubenswrapper[4974]: I1013 18:14:40.624962 4974 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Oct 13 18:14:40 crc kubenswrapper[4974]: I1013 18:14:40.625050 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Oct 13 18:14:41 crc kubenswrapper[4974]: I1013 18:14:41.226010 4974 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 13 18:14:41 crc kubenswrapper[4974]: I1013 
18:14:41.226157 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 18:14:43.572696 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 18:14:43.572928 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 18:14:43.575117 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 18:14:43.575181 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 18:14:43.575200 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 18:14:43.579814 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 18:14:43.931996 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 18:14:43.932069 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 18:14:43.933380 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 
18:14:43.933460 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:43 crc kubenswrapper[4974]: I1013 18:14:43.933482 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:44 crc kubenswrapper[4974]: I1013 18:14:44.463202 4974 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Oct 13 18:14:44 crc kubenswrapper[4974]: I1013 18:14:44.672132 4974 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.613954 4974 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.618163 4974 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.618858 4974 trace.go:236] Trace[1852401736]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Oct-2025 18:14:31.986) (total time: 13631ms): Oct 13 18:14:45 crc kubenswrapper[4974]: Trace[1852401736]: ---"Objects listed" error: 13631ms (18:14:45.618) Oct 13 18:14:45 crc kubenswrapper[4974]: Trace[1852401736]: [13.631667186s] [13.631667186s] END Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.618905 4974 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.623820 4974 trace.go:236] Trace[1237860928]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Oct-2025 18:14:35.232) (total time: 10391ms): Oct 13 18:14:45 crc kubenswrapper[4974]: Trace[1237860928]: ---"Objects listed" error: 10391ms (18:14:45.623) Oct 
13 18:14:45 crc kubenswrapper[4974]: Trace[1237860928]: [10.391592923s] [10.391592923s] END Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.623866 4974 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.624141 4974 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.666772 4974 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50704->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.666769 4974 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35512->192.168.126.11:17697: read: connection reset by peer" start-of-body= Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.666869 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:50704->192.168.126.11:17697: read: connection reset by peer" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.666955 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get 
\"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:35512->192.168.126.11:17697: read: connection reset by peer" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.667309 4974 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.667356 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.700110 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.719883 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.731189 4974 apiserver.go:52] "Watching apiserver" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.735591 4974 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.736049 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] 
Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.736616 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.736765 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.736809 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.736855 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.736891 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.737041 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.737109 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.737522 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.737580 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.738985 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.739003 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.740385 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.740582 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.740783 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.741006 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.743137 4974 
desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.743829 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.743873 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.744322 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.782434 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.803274 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.820235 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.820492 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.820596 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.820641 4974 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.820698 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.820743 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.820950 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.820978 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821036 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821090 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821136 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821176 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821220 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821260 4974 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821274 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821304 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821341 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821381 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821425 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821458 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821495 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821540 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821585 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821621 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 18:14:45 crc kubenswrapper[4974]: 
I1013 18:14:45.821686 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821729 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821719 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821764 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.821805 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822064 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822104 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822137 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822160 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822197 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822222 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822252 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822276 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822300 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822294 
4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822326 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822420 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822497 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.822955 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.823015 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.823026 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.823072 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.823092 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.823231 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.823242 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.823290 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824138 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824106 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824179 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824238 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824278 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824401 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824439 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824438 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824473 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824507 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824537 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824567 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824598 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" 
(UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824631 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824682 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824713 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824749 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824782 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 18:14:45 crc 
kubenswrapper[4974]: I1013 18:14:45.824809 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824843 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824872 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824903 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824934 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824969 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824999 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825030 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825067 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825185 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825221 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: 
\"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825256 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825623 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825718 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825755 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825796 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 
18:14:45.825837 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825873 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825906 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825944 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825982 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826017 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: 
\"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826049 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826078 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826109 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826147 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826179 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 13 18:14:45 crc kubenswrapper[4974]: 
I1013 18:14:45.826211 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826243 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826292 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826325 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826357 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826391 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod 
\"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826423 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826456 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826488 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826522 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826555 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826586 4974 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826617 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826681 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826716 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826753 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826787 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" 
(UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826978 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.827022 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.827054 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.827084 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.827113 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.827143 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824463 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824457 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824580 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824731 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824864 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824871 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824907 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824850 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.824971 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.827746 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825298 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825512 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826500 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.828255 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.828308 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.828599 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829068 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829721 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829773 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829807 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829846 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829878 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829914 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829950 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829986 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.830021 4974 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.830055 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.830089 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.830122 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.830156 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826641 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod 
"09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826704 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.826726 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.827086 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.827596 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.825022 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.827939 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.828041 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.828112 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.828597 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.828818 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.828987 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829099 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829367 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829378 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.829626 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.830054 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.830335 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:14:46.33029045 +0000 UTC m=+21.234656640 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.837413 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.837444 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.837728 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.838399 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.838681 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.838872 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.830525 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.830692 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.830553 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.830751 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.831273 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.831406 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.831573 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.831752 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.831762 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.832026 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.832056 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.832049 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.833542 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.833772 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.833938 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.833970 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.834160 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.834386 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.834463 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.839465 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.839889 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.839942 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.839977 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840015 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840051 4974 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840086 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840119 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840152 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840193 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840231 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840277 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840323 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840364 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.840466 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.834534 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.834641 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.834806 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.834842 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.835187 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.834856 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.835393 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.835595 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.836391 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841118 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841190 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841244 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841261 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841296 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841348 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841387 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841425 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841461 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841495 4974 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841527 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841583 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841618 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841713 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841778 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" 
(UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841816 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841855 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841894 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841932 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.841971 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842012 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: 
\"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842052 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842096 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842139 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842175 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842209 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Oct 13 18:14:45 crc 
kubenswrapper[4974]: I1013 18:14:45.842247 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842286 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842320 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842358 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842527 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842566 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842609 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842647 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842727 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842764 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842837 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 
18:14:45.842874 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842913 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.842973 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843007 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843048 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843093 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod 
\"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843138 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843172 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843219 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843271 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843322 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843376 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843430 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843484 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843539 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843591 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843681 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843741 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843796 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843849 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844086 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844152 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844216 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844305 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844361 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844404 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844449 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844492 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844900 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845181 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845728 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845772 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845810 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845848 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845881 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845917 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845956 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846022 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846044 4974 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846063 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846080 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846097 4974 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846117 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846135 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846152 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846170 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846186 4974 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846205 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846222 4974 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846239 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846257 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846273 4974 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846289 4974 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846306 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846321 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846337 4974 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846355 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846373 4974 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846392 4974 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846408 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846423 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846440 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846454 4974 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846470 4974 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846546 4974 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath 
\"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846565 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847180 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847205 4974 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847224 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847243 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847265 4974 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847285 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc 
kubenswrapper[4974]: I1013 18:14:45.847302 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847320 4974 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847339 4974 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847356 4974 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847374 4974 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847392 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847410 4974 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847428 4974 reconciler_common.go:293] 
"Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847446 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847463 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847480 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847497 4974 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847514 4974 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847535 4974 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847551 4974 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" 
DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847566 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847581 4974 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847597 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847613 4974 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847628 4974 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847644 4974 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847690 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847712 4974 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847730 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847748 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847765 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847782 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847799 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847817 4974 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847836 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: 
\"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847860 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847878 4974 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847892 4974 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847908 4974 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847927 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847945 4974 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847962 4974 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847979 4974 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847993 4974 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.848009 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.848025 4974 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.848041 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.848058 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.848725 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.848984 4974 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.849692 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.860026 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.860647 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843052 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod 
"496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843722 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.843814 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844237 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844296 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844335 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844410 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.844949 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.845136 4974 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.861528 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-13 18:14:46.3615017 +0000 UTC m=+21.265867790 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.861050 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845303 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845302 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845335 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845673 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845699 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845795 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.845996 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846123 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846153 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846202 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846209 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846742 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.846996 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847094 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847321 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847424 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847638 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.847997 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.848047 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.848392 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.848436 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.849014 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.849007 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.849007 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.849285 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.849536 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.850845 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.850857 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.851329 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.851609 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.852121 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.852621 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.853070 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.853381 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.854488 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.860823 4974 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.862127 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:46.362111956 +0000 UTC m=+21.266478046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.863058 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.863249 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.864094 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.864402 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.864800 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.866083 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.874049 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.874801 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.875203 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.875359 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.875708 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.876284 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.877647 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.878635 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.879445 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.879227 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.879774 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.879849 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.879936 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.880154 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.883393 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.883925 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.884091 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.884780 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.884819 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.884837 4974 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.884899 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:46.384881308 +0000 UTC m=+21.289247398 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.885216 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.886499 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.886908 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.889674 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.889739 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.890066 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.890608 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.890800 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.890830 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.890853 4974 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.890931 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:46.39090813 +0000 UTC m=+21.295274280 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.891242 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.892085 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.892131 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.892230 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.893974 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.894015 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.896306 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.896526 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.896752 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.896758 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.897456 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.897721 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.897925 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.898969 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.899268 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.899388 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.899842 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901050 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901089 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901160 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901431 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901455 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901410 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901493 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901769 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901823 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901890 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901854 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.901971 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.902300 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.902335 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.902367 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.902451 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.902560 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.902761 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.904182 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.904535 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.905777 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.905849 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.905881 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.905964 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.906363 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.906415 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.906836 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.907090 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.907298 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.919232 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.921939 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.922068 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.933210 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.934424 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.939226 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.941076 4974 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c" exitCode=255 Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.941143 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c"} Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.943190 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: E1013 18:14:45.949186 4974 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.949725 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 
18:14:45.949954 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950002 4974 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950018 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950033 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950094 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950136 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950167 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950189 4974 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950203 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950214 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950226 4974 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950256 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950269 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950281 4974 reconciler_common.go:293] 
"Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950293 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950304 4974 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950318 4974 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950333 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950349 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950365 4974 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950376 4974 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node 
\"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950387 4974 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950398 4974 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950410 4974 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950421 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950432 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950442 4974 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950454 4974 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950466 4974 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950481 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950498 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950889 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950907 4974 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950919 4974 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950921 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950930 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: 
I1013 18:14:45.950964 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950977 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950988 4974 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.950999 4974 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951010 4974 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951021 4974 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951032 4974 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951043 4974 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951054 4974 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951064 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951205 4974 scope.go:117] "RemoveContainer" containerID="f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951279 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951512 4974 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951529 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951540 4974 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 
18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951551 4974 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951562 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951572 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951583 4974 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951603 4974 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951614 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951625 4974 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951636 4974 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951646 4974 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951694 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951723 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951736 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951751 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951765 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951777 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: 
\"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951788 4974 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951798 4974 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951809 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951819 4974 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951830 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951841 4974 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951852 4974 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on 
node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951863 4974 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951876 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951888 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951899 4974 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951910 4974 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951920 4974 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951940 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951951 4974 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951963 4974 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951973 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951984 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.951995 4974 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952012 4974 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952028 4974 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952039 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952050 4974 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952060 4974 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952071 4974 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952083 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952093 4974 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952104 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952116 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952127 4974 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952139 4974 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952149 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952160 4974 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952171 4974 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952182 4974 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952192 4974 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Oct 13 
18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952203 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952214 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952225 4974 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952236 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952247 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952259 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952270 4974 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952281 4974 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952292 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952303 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952314 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952325 4974 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952335 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952345 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952359 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952372 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952398 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952412 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.952422 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.962502 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.972812 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.982371 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:45 crc kubenswrapper[4974]: I1013 18:14:45.992695 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.001594 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.011117 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.019949 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.029296 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.038284 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.057757 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.059687 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.070414 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: W1013 18:14:46.075037 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-10086343a328a1117f49c943900f8d8c3ecb8041e3314e04aaa3f721fa21d75d WatchSource:0}: Error finding container 10086343a328a1117f49c943900f8d8c3ecb8041e3314e04aaa3f721fa21d75d: Status 404 returned error can't find the container with id 10086343a328a1117f49c943900f8d8c3ecb8041e3314e04aaa3f721fa21d75d Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.079896 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.080043 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.091101 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.094362 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Oct 13 18:14:46 crc kubenswrapper[4974]: W1013 18:14:46.097048 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-52f124b8ca026a9216fd0d99ee72786874b7f02d1d7e2a00ee5bea9c47f2f79f WatchSource:0}: Error finding container 52f124b8ca026a9216fd0d99ee72786874b7f02d1d7e2a00ee5bea9c47f2f79f: Status 404 returned error can't find the container with id 52f124b8ca026a9216fd0d99ee72786874b7f02d1d7e2a00ee5bea9c47f2f79f Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.104351 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: W1013 18:14:46.112904 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-3cd3b59c33ceb2f6c96c3c7af9d8884cfd868ca5204ec67e3c4db877b6974838 WatchSource:0}: Error finding container 3cd3b59c33ceb2f6c96c3c7af9d8884cfd868ca5204ec67e3c4db877b6974838: Status 404 returned error can't find the container with id 
3cd3b59c33ceb2f6c96c3c7af9d8884cfd868ca5204ec67e3c4db877b6974838 Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.115407 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.355468 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.355628 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:14:47.355597211 +0000 UTC m=+22.259963281 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.456126 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.456203 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.456245 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.456286 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.456441 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.456466 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.456484 4974 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.456551 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:47.456529335 +0000 UTC m=+22.360895455 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.456714 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.456734 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.456751 4974 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.456795 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:47.456781841 +0000 UTC m=+22.361147971 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.456910 4974 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.457011 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:47.456991657 +0000 UTC m=+22.361357737 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.457000 4974 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:14:46 crc kubenswrapper[4974]: E1013 18:14:46.457240 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-13 18:14:47.457165782 +0000 UTC m=+22.361531902 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.948808 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.951382 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd"} Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.952019 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.954415 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16"} Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.954498 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6"} Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.954519 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3cd3b59c33ceb2f6c96c3c7af9d8884cfd868ca5204ec67e3c4db877b6974838"} Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.956147 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"52f124b8ca026a9216fd0d99ee72786874b7f02d1d7e2a00ee5bea9c47f2f79f"} Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.958648 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2"} Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.958766 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"10086343a328a1117f49c943900f8d8c3ecb8041e3314e04aaa3f721fa21d75d"} Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.973134 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:46 crc kubenswrapper[4974]: I1013 18:14:46.994793 4974 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.026211 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.041338 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.065672 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.080983 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.095358 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.112088 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.130255 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.145767 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.169757 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.185613 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.222957 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.242446 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.264634 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.279127 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.364706 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.365147 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:14:49.36510802 +0000 UTC m=+24.269474100 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.465366 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.465417 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.465438 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.465464 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465608 4974 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465640 4974 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465704 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465723 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465736 4974 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465753 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:49.465727605 +0000 UTC m=+24.370093685 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465624 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465783 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:49.465771106 +0000 UTC m=+24.370137396 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465793 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465808 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:49.465798137 +0000 UTC m=+24.370164227 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465811 4974 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.465870 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:49.465845118 +0000 UTC m=+24.370211198 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.810789 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.810833 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.810885 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.810971 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.811181 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:14:47 crc kubenswrapper[4974]: E1013 18:14:47.811254 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.815468 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.816410 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.817178 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.818426 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" 
path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.820515 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.821325 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.822140 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.822984 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.824944 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.825672 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.826325 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.827643 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.828351 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.829562 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.830280 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.832120 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.832853 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.833325 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.836643 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.837529 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.838187 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.839860 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.840400 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.841797 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.842304 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.843625 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.844590 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.845739 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.846470 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.847073 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.848151 4974 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.848291 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.850402 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.851583 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.852140 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.854272 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.856225 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.856825 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.858117 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.859406 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.860511 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.861338 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.862549 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.863371 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.864459 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.865232 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.866195 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.867031 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.868068 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.868611 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.869535 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.870116 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.870689 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.871567 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.904877 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-xpb6b"] Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.905382 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-r5fj5"] Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.905527 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.906050 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-r5fj5" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.909041 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.909124 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.909227 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.909764 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.909857 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.910035 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.910157 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.913026 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-98z75"] Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.913401 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.914572 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwcs8"] Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.915243 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.916291 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-gwv4w"] Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.916998 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xcspx"] Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.917235 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.917242 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.922111 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.922286 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.922505 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.922532 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.922830 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.923685 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.923628 4974 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.923795 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.923704 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.923901 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.923923 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.924079 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.924098 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.924101 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.924183 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.924271 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.924199 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.924404 4974 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.924459 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.955297 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static
-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c
64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a6
7314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.969961 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-cnibin\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970011 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-var-lib-kubelet\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970039 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-socket-dir-parent\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970066 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-daemon-config\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970091 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5zsg\" (UniqueName: \"kubernetes.io/projected/6b2a80de-225e-4b5a-93fa-a05e3524db4e-kube-api-access-z5zsg\") pod \"node-ca-98z75\" (UID: \"6b2a80de-225e-4b5a-93fa-a05e3524db4e\") " pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970119 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-systemd-units\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970147 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-node-log\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970167 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-cni-binary-copy\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970188 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970242 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-var-lib-openvswitch\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970266 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/013d968f-6cef-476b-a6fc-88d396bd5cd1-rootfs\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970290 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-kubelet\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970344 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b2a80de-225e-4b5a-93fa-a05e3524db4e-host\") pod \"node-ca-98z75\" (UID: \"6b2a80de-225e-4b5a-93fa-a05e3524db4e\") " pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970384 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6b2a80de-225e-4b5a-93fa-a05e3524db4e-serviceca\") pod \"node-ca-98z75\" (UID: \"6b2a80de-225e-4b5a-93fa-a05e3524db4e\") " pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970405 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/05ac96ec-aee9-4f1d-868c-6f2252c021bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970429 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970450 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-run-k8s-cni-cncf-io\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc 
kubenswrapper[4974]: I1013 18:14:47.970471 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-netns\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970487 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-etc-openvswitch\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970506 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw9b5\" (UniqueName: \"kubernetes.io/projected/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-kube-api-access-nw9b5\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970525 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-cnibin\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970546 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-run-netns\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 
crc kubenswrapper[4974]: I1013 18:14:47.970560 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-hostroot\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970587 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-run-multus-certs\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970603 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-system-cni-dir\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970622 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-openvswitch\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970639 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-netd\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 
18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970694 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-script-lib\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970714 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccpr7\" (UniqueName: \"kubernetes.io/projected/05ac96ec-aee9-4f1d-868c-6f2252c021bb-kube-api-access-ccpr7\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970731 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-system-cni-dir\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970745 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-conf-dir\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970790 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-slash\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970806 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-ovn\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970824 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/05ac96ec-aee9-4f1d-868c-6f2252c021bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970841 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cddd1702-fcd5-40f0-97fc-61eb59192de8-hosts-file\") pod \"node-resolver-r5fj5\" (UID: \"cddd1702-fcd5-40f0-97fc-61eb59192de8\") " pod="openshift-dns/node-resolver-r5fj5" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970914 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970933 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8nch\" (UniqueName: \"kubernetes.io/projected/d9f54cc7-5b3b-4481-9be5-f03df1854435-kube-api-access-s8nch\") pod \"ovnkube-node-zwcs8\" (UID: 
\"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970952 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-os-release\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970970 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-etc-kubernetes\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.970989 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-log-socket\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971008 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovn-node-metrics-cert\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971024 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-systemd\") pod \"ovnkube-node-zwcs8\" (UID: 
\"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971041 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-os-release\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971071 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klxdc\" (UniqueName: \"kubernetes.io/projected/013d968f-6cef-476b-a6fc-88d396bd5cd1-kube-api-access-klxdc\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971092 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/013d968f-6cef-476b-a6fc-88d396bd5cd1-mcd-auth-proxy-config\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971110 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-bin\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971130 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-cni-dir\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971148 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-var-lib-cni-bin\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971166 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-var-lib-cni-multus\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971194 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/013d968f-6cef-476b-a6fc-88d396bd5cd1-proxy-tls\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971215 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-config\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971236 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-env-overrides\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.971259 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27gl5\" (UniqueName: \"kubernetes.io/projected/cddd1702-fcd5-40f0-97fc-61eb59192de8-kube-api-access-27gl5\") pod \"node-resolver-r5fj5\" (UID: \"cddd1702-fcd5-40f0-97fc-61eb59192de8\") " pod="openshift-dns/node-resolver-r5fj5" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.980597 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:47 crc kubenswrapper[4974]: I1013 18:14:47.996904 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:47Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.026726 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.055984 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.072904 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-script-lib\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.072953 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccpr7\" (UniqueName: \"kubernetes.io/projected/05ac96ec-aee9-4f1d-868c-6f2252c021bb-kube-api-access-ccpr7\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.072987 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-system-cni-dir\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073010 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-conf-dir\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073031 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-slash\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073060 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-ovn\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073132 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-system-cni-dir\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073191 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-slash\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073186 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-conf-dir\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073287 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-ovn\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073460 4974 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073491 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8nch\" (UniqueName: \"kubernetes.io/projected/d9f54cc7-5b3b-4481-9be5-f03df1854435-kube-api-access-s8nch\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073518 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/05ac96ec-aee9-4f1d-868c-6f2252c021bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073562 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cddd1702-fcd5-40f0-97fc-61eb59192de8-hosts-file\") pod \"node-resolver-r5fj5\" (UID: \"cddd1702-fcd5-40f0-97fc-61eb59192de8\") " pod="openshift-dns/node-resolver-r5fj5" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073591 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-log-socket\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073621 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovn-node-metrics-cert\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073645 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-os-release\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073687 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-etc-kubernetes\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073728 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klxdc\" (UniqueName: \"kubernetes.io/projected/013d968f-6cef-476b-a6fc-88d396bd5cd1-kube-api-access-klxdc\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073755 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-systemd\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073781 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-os-release\") 
pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073806 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-cni-dir\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073830 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-var-lib-cni-bin\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073854 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-var-lib-cni-multus\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073878 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/013d968f-6cef-476b-a6fc-88d396bd5cd1-proxy-tls\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073905 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/013d968f-6cef-476b-a6fc-88d396bd5cd1-mcd-auth-proxy-config\") pod \"machine-config-daemon-xpb6b\" (UID: 
\"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073931 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-bin\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073958 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-env-overrides\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073953 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-script-lib\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.073982 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27gl5\" (UniqueName: \"kubernetes.io/projected/cddd1702-fcd5-40f0-97fc-61eb59192de8-kube-api-access-27gl5\") pod \"node-resolver-r5fj5\" (UID: \"cddd1702-fcd5-40f0-97fc-61eb59192de8\") " pod="openshift-dns/node-resolver-r5fj5" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074077 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-config\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074111 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-cnibin\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074142 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-var-lib-kubelet\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074173 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5zsg\" (UniqueName: \"kubernetes.io/projected/6b2a80de-225e-4b5a-93fa-a05e3524db4e-kube-api-access-z5zsg\") pod \"node-ca-98z75\" (UID: \"6b2a80de-225e-4b5a-93fa-a05e3524db4e\") " pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074194 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-systemd-units\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074225 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-socket-dir-parent\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074237 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-ovn-kubernetes\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074249 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-daemon-config\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074300 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074334 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-var-lib-openvswitch\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074357 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-node-log\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074381 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-cni-binary-copy\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074403 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b2a80de-225e-4b5a-93fa-a05e3524db4e-host\") pod \"node-ca-98z75\" (UID: \"6b2a80de-225e-4b5a-93fa-a05e3524db4e\") " pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074424 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6b2a80de-225e-4b5a-93fa-a05e3524db4e-serviceca\") pod \"node-ca-98z75\" (UID: \"6b2a80de-225e-4b5a-93fa-a05e3524db4e\") " pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074448 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/05ac96ec-aee9-4f1d-868c-6f2252c021bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074473 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/013d968f-6cef-476b-a6fc-88d396bd5cd1-rootfs\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074496 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-kubelet\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074520 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074545 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-run-k8s-cni-cncf-io\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074574 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw9b5\" (UniqueName: \"kubernetes.io/projected/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-kube-api-access-nw9b5\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074599 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-cnibin\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074623 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-netns\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074664 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-etc-openvswitch\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074696 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-run-multus-certs\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074720 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-system-cni-dir\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074745 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-openvswitch\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074769 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-netd\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074796 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-run-netns\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074822 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-hostroot\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074840 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-daemon-config\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.074890 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-hostroot\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075315 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-config\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075376 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-cnibin\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075404 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-var-lib-kubelet\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075646 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-node-log\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075745 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-cnibin\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075763 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-log-socket\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075712 4974 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-run-multus-certs\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075736 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/cddd1702-fcd5-40f0-97fc-61eb59192de8-hosts-file\") pod \"node-resolver-r5fj5\" (UID: \"cddd1702-fcd5-40f0-97fc-61eb59192de8\") " pod="openshift-dns/node-resolver-r5fj5" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075740 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-socket-dir-parent\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075689 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-multus-cni-dir\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.075671 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-systemd-units\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076118 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-openvswitch\") pod \"ovnkube-node-zwcs8\" 
(UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076220 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/05ac96ec-aee9-4f1d-868c-6f2252c021bb-cni-binary-copy\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076371 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/013d968f-6cef-476b-a6fc-88d396bd5cd1-mcd-auth-proxy-config\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076393 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-var-lib-openvswitch\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076561 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-run-netns\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076685 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-run-k8s-cni-cncf-io\") pod \"multus-xcspx\" (UID: 
\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076785 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-os-release\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076827 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-systemd\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076849 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6b2a80de-225e-4b5a-93fa-a05e3524db4e-host\") pod \"node-ca-98z75\" (UID: \"6b2a80de-225e-4b5a-93fa-a05e3524db4e\") " pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076880 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-os-release\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076901 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/05ac96ec-aee9-4f1d-868c-6f2252c021bb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 
18:14:48.076947 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-etc-openvswitch\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076975 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-etc-kubernetes\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076995 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-netd\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.076799 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/013d968f-6cef-476b-a6fc-88d396bd5cd1-rootfs\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.077148 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.077168 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/05ac96ec-aee9-4f1d-868c-6f2252c021bb-system-cni-dir\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.077187 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-var-lib-cni-multus\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.077204 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-netns\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.077461 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-env-overrides\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.077588 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-kubelet\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.077698 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.077741 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6b2a80de-225e-4b5a-93fa-a05e3524db4e-serviceca\") pod \"node-ca-98z75\" (UID: \"6b2a80de-225e-4b5a-93fa-a05e3524db4e\") " pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.077876 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-bin\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.078054 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-host-var-lib-cni-bin\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.078156 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-cni-binary-copy\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.087625 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovn-node-metrics-cert\") pod \"ovnkube-node-zwcs8\" (UID: 
\"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.088993 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.094235 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/013d968f-6cef-476b-a6fc-88d396bd5cd1-proxy-tls\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.103050 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccpr7\" (UniqueName: \"kubernetes.io/projected/05ac96ec-aee9-4f1d-868c-6f2252c021bb-kube-api-access-ccpr7\") pod \"multus-additional-cni-plugins-gwv4w\" (UID: \"05ac96ec-aee9-4f1d-868c-6f2252c021bb\") " pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.103297 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27gl5\" (UniqueName: \"kubernetes.io/projected/cddd1702-fcd5-40f0-97fc-61eb59192de8-kube-api-access-27gl5\") pod \"node-resolver-r5fj5\" (UID: \"cddd1702-fcd5-40f0-97fc-61eb59192de8\") " pod="openshift-dns/node-resolver-r5fj5" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.105130 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klxdc\" (UniqueName: \"kubernetes.io/projected/013d968f-6cef-476b-a6fc-88d396bd5cd1-kube-api-access-klxdc\") pod \"machine-config-daemon-xpb6b\" (UID: \"013d968f-6cef-476b-a6fc-88d396bd5cd1\") " pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.107349 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw9b5\" (UniqueName: \"kubernetes.io/projected/9c38c0e3-9bee-402b-adf0-27ac9e31c0f0-kube-api-access-nw9b5\") pod \"multus-xcspx\" (UID: \"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\") " pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.111938 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5zsg\" (UniqueName: \"kubernetes.io/projected/6b2a80de-225e-4b5a-93fa-a05e3524db4e-kube-api-access-z5zsg\") pod \"node-ca-98z75\" (UID: \"6b2a80de-225e-4b5a-93fa-a05e3524db4e\") " pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.114550 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8nch\" (UniqueName: \"kubernetes.io/projected/d9f54cc7-5b3b-4481-9be5-f03df1854435-kube-api-access-s8nch\") pod \"ovnkube-node-zwcs8\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.114782 4974 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.129871 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.144761 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.164947 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.180344 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.197024 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.214742 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.219671 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.231574 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-r5fj5" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.233119 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.236634 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.242877 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-98z75" Oct 13 18:14:48 crc kubenswrapper[4974]: W1013 18:14:48.244901 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcddd1702_fcd5_40f0_97fc_61eb59192de8.slice/crio-58263d3d9cd0d7f2a4376ef6d08a620adcfa98d539c1f08c09f70bc10d03e18d WatchSource:0}: Error finding container 58263d3d9cd0d7f2a4376ef6d08a620adcfa98d539c1f08c09f70bc10d03e18d: Status 404 returned error can't find the container with id 58263d3d9cd0d7f2a4376ef6d08a620adcfa98d539c1f08c09f70bc10d03e18d Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.251717 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.255644 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.263707 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xcspx" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.267967 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.290608 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\
\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.300499 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Oct 13 18:14:48 crc kubenswrapper[4974]: W1013 18:14:48.303975 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05ac96ec_aee9_4f1d_868c_6f2252c021bb.slice/crio-678247d8f583a24361eb83bf43a0ea1b71289151dbab2f02eda7ac92e389d6e8 WatchSource:0}: Error finding container 
678247d8f583a24361eb83bf43a0ea1b71289151dbab2f02eda7ac92e389d6e8: Status 404 returned error can't find the container with id 678247d8f583a24361eb83bf43a0ea1b71289151dbab2f02eda7ac92e389d6e8 Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.313528 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.328723 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.344636 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.363155 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.377577 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.394106 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.408834 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.425031 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.451252 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.471251 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.486763 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.507032 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.521925 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.534236 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.564110 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.598484 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.630621 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.642964 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.654069 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.664860 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.677879 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.691014 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.707597 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.966543 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r5fj5" event={"ID":"cddd1702-fcd5-40f0-97fc-61eb59192de8","Type":"ContainerStarted","Data":"fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.967102 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-r5fj5" event={"ID":"cddd1702-fcd5-40f0-97fc-61eb59192de8","Type":"ContainerStarted","Data":"58263d3d9cd0d7f2a4376ef6d08a620adcfa98d539c1f08c09f70bc10d03e18d"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.968620 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.968710 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 
18:14:48.968726 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"084f1302c52bd4ba002ca1f81f4af77354f0355adccad762b9e095dbdbc948a8"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.969974 4974 generic.go:334] "Generic (PLEG): container finished" podID="05ac96ec-aee9-4f1d-868c-6f2252c021bb" containerID="55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6" exitCode=0 Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.970056 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" event={"ID":"05ac96ec-aee9-4f1d-868c-6f2252c021bb","Type":"ContainerDied","Data":"55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.970226 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" event={"ID":"05ac96ec-aee9-4f1d-868c-6f2252c021bb","Type":"ContainerStarted","Data":"678247d8f583a24361eb83bf43a0ea1b71289151dbab2f02eda7ac92e389d6e8"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.971604 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7" exitCode=0 Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.971646 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.971682 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" 
event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"8dec09712400c9a59fc9f1746ad1c613e0ad3bf489988bf2ff3b84803b6a0e4d"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.973055 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-98z75" event={"ID":"6b2a80de-225e-4b5a-93fa-a05e3524db4e","Type":"ContainerStarted","Data":"3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.973102 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-98z75" event={"ID":"6b2a80de-225e-4b5a-93fa-a05e3524db4e","Type":"ContainerStarted","Data":"283f3ad933e0a255e50965c7e7129a09f507a54865e7750101a4b33d4b091880"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.978006 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.980130 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xcspx" event={"ID":"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0","Type":"ContainerStarted","Data":"e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.980272 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xcspx" event={"ID":"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0","Type":"ContainerStarted","Data":"d65d41ce41611d3a3e223e7fb87c17a77c739d14c875bfb3e0e6c14c1f76ddeb"} Oct 13 18:14:48 crc kubenswrapper[4974]: I1013 18:14:48.986182 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:48Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.006426 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.032791 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.050642 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.066871 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.080146 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.095487 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.114951 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.159719 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.182050 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.243182 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.267619 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.300769 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e
5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.313872 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.325356 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.339862 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.356255 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.388959 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.389242 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:14:53.389212914 +0000 UTC m=+28.293578994 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.390690 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.408374 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.422742 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.443260 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.458751 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.489477 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.489892 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.490069 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.490355 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.490445 4974 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 
13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.490569 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:53.490546248 +0000 UTC m=+28.394912338 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.490224 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.490804 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.490941 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.491188 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.491269 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.491337 4974 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.491426 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:53.491415261 +0000 UTC m=+28.395781351 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.490386 4974 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.491605 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:14:53.491593076 +0000 UTC m=+28.395959166 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.491739 4974 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.491837 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-13 18:14:53.491826933 +0000 UTC m=+28.396193023 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.510170 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"sta
te\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"}
,{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b73
1ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731c
a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.526841 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.542727 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.560837 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.575859 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.595671 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.607554 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:49Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.810580 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.810716 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.810738 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.810808 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.810903 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:14:49 crc kubenswrapper[4974]: E1013 18:14:49.811056 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.987753 4974 generic.go:334] "Generic (PLEG): container finished" podID="05ac96ec-aee9-4f1d-868c-6f2252c021bb" containerID="6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6" exitCode=0 Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.987826 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" event={"ID":"05ac96ec-aee9-4f1d-868c-6f2252c021bb","Type":"ContainerDied","Data":"6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6"} Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.994717 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f"} Oct 13 18:14:49 crc 
kubenswrapper[4974]: I1013 18:14:49.994798 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b"} Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.994816 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f"} Oct 13 18:14:49 crc kubenswrapper[4974]: I1013 18:14:49.994834 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e"} Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.029463 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.046192 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.064597 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.088245 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.108887 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.128863 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.154178 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.171721 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 
18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.187097 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.203770 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.217961 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.238085 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.252759 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.269982 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:50 crc kubenswrapper[4974]: I1013 18:14:50.291978 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:50Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.001332 4974 generic.go:334] "Generic (PLEG): container finished" podID="05ac96ec-aee9-4f1d-868c-6f2252c021bb" containerID="83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3" exitCode=0 Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.001416 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" event={"ID":"05ac96ec-aee9-4f1d-868c-6f2252c021bb","Type":"ContainerDied","Data":"83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3"} Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.010972 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941"} Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.011025 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73"} Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.022150 4974 
status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813
b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":
\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resourc
es\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.039249 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.053895 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.068633 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.082316 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.100546 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.117316 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 
18:14:51.132862 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.146327 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.161199 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.175802 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.189311 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.201148 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.212532 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.233866 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:51Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.810701 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.810722 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:51 crc kubenswrapper[4974]: I1013 18:14:51.810772 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:51 crc kubenswrapper[4974]: E1013 18:14:51.810996 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:14:51 crc kubenswrapper[4974]: E1013 18:14:51.811097 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:14:51 crc kubenswrapper[4974]: E1013 18:14:51.811184 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.017758 4974 generic.go:334] "Generic (PLEG): container finished" podID="05ac96ec-aee9-4f1d-868c-6f2252c021bb" containerID="e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd" exitCode=0 Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.017829 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" event={"ID":"05ac96ec-aee9-4f1d-868c-6f2252c021bb","Type":"ContainerDied","Data":"e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.024257 4974 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.027804 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 
13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.027866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.027883 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.028058 4974 kubelet_node_status.go:76] "Attempting to register node" node="crc" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.037769 4974 kubelet_node_status.go:115] "Node was previously registered" node="crc" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.038114 4974 kubelet_node_status.go:79] "Successfully registered node" node="crc" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.039622 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.039674 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.039691 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.039709 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.039722 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.044495 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.065574 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: E1013 18:14:52.067177 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.072486 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.072575 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.072595 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.072692 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.072714 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.078462 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: E1013 18:14:52.087872 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.089349 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.093790 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.093912 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.093935 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.094004 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.094026 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.107097 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: E1013 18:14:52.110273 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.122865 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.122904 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.122914 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.122930 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.122939 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.133554 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: E1013 18:14:52.137826 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.142276 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.142336 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.142360 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.142385 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.142402 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.155103 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: E1013 18:14:52.158578 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: E1013 18:14:52.158715 4974 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.162990 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.163018 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.163028 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.163044 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.163055 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.166607 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.180511 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a793
79b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.193807 4974 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.209324 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.225931 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.242811 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.256509 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.264967 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.264993 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.265018 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 
18:14:52.265032 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.265042 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.269416 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:52Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.367072 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.367134 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.367156 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.367184 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.367206 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.470211 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.470263 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.470274 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.470291 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.470305 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.573299 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.573359 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.573377 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.573401 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.573419 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.676647 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.676740 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.676758 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.676785 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.676803 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.779239 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.779302 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.779319 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.779343 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.779364 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.883081 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.883139 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.883158 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.883181 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.883201 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.986825 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.986886 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.986905 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.986930 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:52 crc kubenswrapper[4974]: I1013 18:14:52.986951 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:52Z","lastTransitionTime":"2025-10-13T18:14:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.025212 4974 generic.go:334] "Generic (PLEG): container finished" podID="05ac96ec-aee9-4f1d-868c-6f2252c021bb" containerID="6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97" exitCode=0 Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.025320 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" event={"ID":"05ac96ec-aee9-4f1d-868c-6f2252c021bb","Type":"ContainerDied","Data":"6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.042582 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.052969 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.081218 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.089717 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.089789 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.089812 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 
18:14:53.089842 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.089864 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:53Z","lastTransitionTime":"2025-10-13T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.104323 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afb
a93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\
":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.123369 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.141403 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.175927 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-me
trics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"
etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-s
etup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.195609 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.197237 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.197275 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.197292 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.197317 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.197335 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:53Z","lastTransitionTime":"2025-10-13T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.219456 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.235482 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.270742 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.292090 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.300969 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.301017 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.301031 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.301050 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.301062 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:53Z","lastTransitionTime":"2025-10-13T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.310045 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd7
83ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-1
0-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.328302 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.343988 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.363551 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.403800 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.403844 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.403857 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.403877 4974 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.403893 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:53Z","lastTransitionTime":"2025-10-13T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.431178 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.431476 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:15:01.431451869 +0000 UTC m=+36.335817989 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.507390 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.507446 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.507463 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.507489 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.507511 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:53Z","lastTransitionTime":"2025-10-13T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.532389 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.532480 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.532526 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.532561 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.532812 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.532847 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.532867 4974 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.532947 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:01.532924007 +0000 UTC m=+36.437290127 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.533069 4974 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.533197 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-10-13 18:15:01.533168784 +0000 UTC m=+36.437534904 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.533241 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.533307 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.533326 4974 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.533363 4974 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.533404 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:01.53337803 +0000 UTC m=+36.437744210 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.533553 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:01.533533824 +0000 UTC m=+36.437900014 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.610887 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.610934 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.610950 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.610975 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.610991 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:53Z","lastTransitionTime":"2025-10-13T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.714916 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.714995 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.715022 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.715057 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.715080 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:53Z","lastTransitionTime":"2025-10-13T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.811359 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.811391 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.811640 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.811746 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.811901 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:14:53 crc kubenswrapper[4974]: E1013 18:14:53.811979 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.818207 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.818282 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.818303 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.818336 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.818354 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:53Z","lastTransitionTime":"2025-10-13T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.921748 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.922048 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.922085 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.922116 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:53 crc kubenswrapper[4974]: I1013 18:14:53.922137 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:53Z","lastTransitionTime":"2025-10-13T18:14:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.025994 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.026066 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.026090 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.026122 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.026145 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:54Z","lastTransitionTime":"2025-10-13T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.051947 4974 generic.go:334] "Generic (PLEG): container finished" podID="05ac96ec-aee9-4f1d-868c-6f2252c021bb" containerID="ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f" exitCode=0 Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.052038 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" event={"ID":"05ac96ec-aee9-4f1d-868c-6f2252c021bb","Type":"ContainerDied","Data":"ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.080258 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.095965 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.116793 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.130326 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:54 crc 
kubenswrapper[4974]: I1013 18:14:54.130380 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.130394 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.130414 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.130427 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:54Z","lastTransitionTime":"2025-10-13T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.134365 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.169145 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.196677 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.217041 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.238284 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.238319 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.238340 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.238354 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.238363 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:54Z","lastTransitionTime":"2025-10-13T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.243389 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.275876 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cer
t-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.289605 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.302870 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.316310 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.328957 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.341016 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.341060 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.341072 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.341091 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.341101 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:54Z","lastTransitionTime":"2025-10-13T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.342594 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.360104 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:54Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.443241 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.443313 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.443330 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.443353 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.443369 4974 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:54Z","lastTransitionTime":"2025-10-13T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.552311 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.552359 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.552372 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.552389 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.552401 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:54Z","lastTransitionTime":"2025-10-13T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.655430 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.655474 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.655484 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.655502 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.655515 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:54Z","lastTransitionTime":"2025-10-13T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.758460 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.758495 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.758515 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.758534 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.758546 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:54Z","lastTransitionTime":"2025-10-13T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.862880 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.862915 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.862924 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.862943 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.862971 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:54Z","lastTransitionTime":"2025-10-13T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.966461 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.966493 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.966502 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.966520 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:54 crc kubenswrapper[4974]: I1013 18:14:54.966534 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:54Z","lastTransitionTime":"2025-10-13T18:14:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.061417 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" event={"ID":"05ac96ec-aee9-4f1d-868c-6f2252c021bb","Type":"ContainerStarted","Data":"79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.068398 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.068433 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.068443 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.068460 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.068471 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:55Z","lastTransitionTime":"2025-10-13T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.069202 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.070271 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.070526 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.081032 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.095925 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.101525 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.107324 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.114236 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.143483 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitiali
zing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\
\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.160598 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.172132 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.172169 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.172179 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.172196 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.172208 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:55Z","lastTransitionTime":"2025-10-13T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.181039 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.191865 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.212446 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.229200 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.246487 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.259214 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.273908 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.275081 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.275124 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.275139 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.275158 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.275170 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:55Z","lastTransitionTime":"2025-10-13T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.287737 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z 
is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.306764 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.322928 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.342555 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.359282 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.374142 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.378859 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.378941 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.378963 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.378995 4974 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.379025 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:55Z","lastTransitionTime":"2025-10-13T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.392202 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f
71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.407777 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.424688 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.444885 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.463555 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.482059 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.482251 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.482331 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.482423 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.482501 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:55Z","lastTransitionTime":"2025-10-13T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.482770 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.497895 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.509275 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.528311 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.556391 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.572881 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.585148 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.585579 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.585947 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.586173 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.586372 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:55Z","lastTransitionTime":"2025-10-13T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.586969 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.694428 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.694480 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.694494 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.694515 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.694535 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:55Z","lastTransitionTime":"2025-10-13T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.797379 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.797417 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.797426 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.797441 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.797451 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:55Z","lastTransitionTime":"2025-10-13T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.810630 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.810735 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.810748 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:55 crc kubenswrapper[4974]: E1013 18:14:55.810840 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:14:55 crc kubenswrapper[4974]: E1013 18:14:55.811087 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:14:55 crc kubenswrapper[4974]: E1013 18:14:55.811194 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.837720 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.854530 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.871031 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.899841 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.899903 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.899921 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.899946 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.899965 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:55Z","lastTransitionTime":"2025-10-13T18:14:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.906890 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.925257 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.948259 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.964077 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:55 crc kubenswrapper[4974]: I1013 18:14:55.997201 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd
/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33
e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.004051 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.004158 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.004185 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.004220 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 
18:14:56.004247 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:56Z","lastTransitionTime":"2025-10-13T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.015870 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"nam
e\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.041038 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.059136 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.072308 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.077497 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.094567 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-13T18:14:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.108239 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.108297 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.108308 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.108324 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.108335 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:56Z","lastTransitionTime":"2025-10-13T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.123617 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.142805 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.211749 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.211804 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.211821 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 
18:14:56.211845 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.211862 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:56Z","lastTransitionTime":"2025-10-13T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.314673 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.314714 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.314723 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.314754 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.314766 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:56Z","lastTransitionTime":"2025-10-13T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.417976 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.418037 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.418054 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.418080 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.418099 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:56Z","lastTransitionTime":"2025-10-13T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.521175 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.521220 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.521234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.521248 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.521258 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:56Z","lastTransitionTime":"2025-10-13T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.624247 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.624601 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.624722 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.624850 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.624928 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:56Z","lastTransitionTime":"2025-10-13T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.727297 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.727336 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.727347 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.727364 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.727376 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:56Z","lastTransitionTime":"2025-10-13T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.849325 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.849362 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.849373 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.849388 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.849400 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:56Z","lastTransitionTime":"2025-10-13T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.951974 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.952006 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.952013 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.952028 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:56 crc kubenswrapper[4974]: I1013 18:14:56.952038 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:56Z","lastTransitionTime":"2025-10-13T18:14:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.054004 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.054042 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.054055 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.054070 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.054080 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:57Z","lastTransitionTime":"2025-10-13T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.074426 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.157393 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.157457 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.157477 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.157502 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.157519 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:57Z","lastTransitionTime":"2025-10-13T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.260124 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.260178 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.260189 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.260205 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.260217 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:57Z","lastTransitionTime":"2025-10-13T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.362259 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.362292 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.362302 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.362319 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.362330 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:57Z","lastTransitionTime":"2025-10-13T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.465117 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.465181 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.465203 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.465232 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.465256 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:57Z","lastTransitionTime":"2025-10-13T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.568321 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.568374 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.568397 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.568422 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.568440 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:57Z","lastTransitionTime":"2025-10-13T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.671532 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.671591 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.671611 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.671635 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.671691 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:57Z","lastTransitionTime":"2025-10-13T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.775226 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.775328 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.775348 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.775374 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.775774 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:57Z","lastTransitionTime":"2025-10-13T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.811391 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.811400 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.811488 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:57 crc kubenswrapper[4974]: E1013 18:14:57.812183 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:14:57 crc kubenswrapper[4974]: E1013 18:14:57.812368 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:14:57 crc kubenswrapper[4974]: E1013 18:14:57.812586 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.878961 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.879038 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.879058 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.879081 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.879098 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:57Z","lastTransitionTime":"2025-10-13T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.981599 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.981690 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.981710 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.981734 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:57 crc kubenswrapper[4974]: I1013 18:14:57.981752 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:57Z","lastTransitionTime":"2025-10-13T18:14:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.081272 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/0.log" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.084126 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.084199 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.084223 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.084251 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.084273 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:58Z","lastTransitionTime":"2025-10-13T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.086192 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111" exitCode=1 Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.086254 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111"} Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.087458 4974 scope.go:117] "RemoveContainer" containerID="bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.125303 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\
":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.145471 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.161948 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.186206 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.187301 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.187515 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.187685 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.187821 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.187976 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:58Z","lastTransitionTime":"2025-10-13T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.206023 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.222584 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.242158 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.265289 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.284742 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.291390 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:58 crc 
kubenswrapper[4974]: I1013 18:14:58.291419 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.291430 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.291448 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.291460 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:58Z","lastTransitionTime":"2025-10-13T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.303545 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.319829 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.334350 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.346903 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.357906 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.385382 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:57Z\\\",\\\"message\\\":\\\" 6328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1013 18:14:57.469292 6328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:14:57.469370 6328 handler.go:190] Sending *v1.EgressIP 
event handler 8 for removal\\\\nI1013 18:14:57.469413 6328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 18:14:57.469423 6328 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:14:57.469441 6328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:57.469452 6328 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:14:57.469488 6328 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:57.469525 6328 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:14:57.469528 6328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:57.469553 6328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:57.469541 6328 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:57.469555 6328 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1013 18:14:57.469584 6328 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:57.469608 6328 factory.go:656] Stopping watch factory\\\\nI1013 18:14:57.469627 6328 ovnkube.go:599] Stopped 
ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218
d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:58Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.394385 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.394430 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.394446 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.394516 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.394532 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:58Z","lastTransitionTime":"2025-10-13T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.497503 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.497588 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.497608 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.497635 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.497676 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:58Z","lastTransitionTime":"2025-10-13T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.600271 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.600322 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.600335 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.600352 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.600365 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:58Z","lastTransitionTime":"2025-10-13T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.703140 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.703176 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.703193 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.703214 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.703226 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:58Z","lastTransitionTime":"2025-10-13T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.806017 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.806063 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.806074 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.806095 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.806107 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:58Z","lastTransitionTime":"2025-10-13T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.909024 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.909071 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.909082 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.909100 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:58 crc kubenswrapper[4974]: I1013 18:14:58.909113 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:58Z","lastTransitionTime":"2025-10-13T18:14:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.012155 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.012199 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.012212 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.012230 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.012242 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:59Z","lastTransitionTime":"2025-10-13T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.091709 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/0.log" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.095036 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.095147 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.136944 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.138613 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.138711 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.138727 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.138752 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.138764 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:59Z","lastTransitionTime":"2025-10-13T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.156377 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.178334 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.197503 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.228994 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:57Z\\\",\\\"message\\\":\\\" 6328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1013 18:14:57.469292 6328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:14:57.469370 6328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:57.469413 6328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 
18:14:57.469423 6328 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:14:57.469441 6328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:57.469452 6328 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:14:57.469488 6328 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:57.469525 6328 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:14:57.469528 6328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:57.469553 6328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:57.469541 6328 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:57.469555 6328 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1013 18:14:57.469584 6328 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:57.469608 6328 factory.go:656] Stopping watch factory\\\\nI1013 18:14:57.469627 6328 ovnkube.go:599] Stopped 
ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.241461 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.241527 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.241547 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.241573 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.241595 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:59Z","lastTransitionTime":"2025-10-13T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.245021 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.260198 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.276803 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.299745 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.312477 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.324897 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.339224 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.343366 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.343423 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 
18:14:59.343440 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.343462 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.343478 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:59Z","lastTransitionTime":"2025-10-13T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.371310 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.387808 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.408783 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.447187 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.447222 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.447234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.447251 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.447264 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:59Z","lastTransitionTime":"2025-10-13T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.550312 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.550372 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.550391 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.550416 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.550473 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:59Z","lastTransitionTime":"2025-10-13T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.654798 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.654881 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.654905 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.654935 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.654957 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:59Z","lastTransitionTime":"2025-10-13T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.758301 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.758368 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.758387 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.758413 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.758431 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:59Z","lastTransitionTime":"2025-10-13T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.810946 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:14:59 crc kubenswrapper[4974]: E1013 18:14:59.811100 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.811405 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:14:59 crc kubenswrapper[4974]: E1013 18:14:59.811539 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.811737 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:14:59 crc kubenswrapper[4974]: E1013 18:14:59.811923 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.861580 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.861682 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.861726 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.861752 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.861770 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:59Z","lastTransitionTime":"2025-10-13T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.874454 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.898766 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc2
76e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.920263 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.942321 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.961741 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.964503 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.964551 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.964570 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.964594 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.964614 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:14:59Z","lastTransitionTime":"2025-10-13T18:14:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.982358 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:14:59 crc kubenswrapper[4974]: I1013 18:14:59.999411 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:14:59Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.019378 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.052683 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:57Z\\\",\\\"message\\\":\\\" 6328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1013 18:14:57.469292 6328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:14:57.469370 6328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:57.469413 6328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 
18:14:57.469423 6328 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:14:57.469441 6328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:57.469452 6328 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:14:57.469488 6328 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:57.469525 6328 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:14:57.469528 6328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:57.469553 6328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:57.469541 6328 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:57.469555 6328 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1013 18:14:57.469584 6328 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:57.469608 6328 factory.go:656] Stopping watch factory\\\\nI1013 18:14:57.469627 6328 ovnkube.go:599] Stopped 
ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\
",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\
",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.067638 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.067731 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.067751 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.067776 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.067794 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:00Z","lastTransitionTime":"2025-10-13T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.087496 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.101271 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/1.log" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.102330 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/0.log" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.106461 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5" exitCode=1 Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.106519 4974 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5"} Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.106609 4974 scope.go:117] "RemoveContainer" containerID="bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.108796 4974 scope.go:117] "RemoveContainer" containerID="78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5" Oct 13 18:15:00 crc kubenswrapper[4974]: E1013 18:15:00.109158 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.114175 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.135410 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.170457 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.170497 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.170509 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.170526 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.170539 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:00Z","lastTransitionTime":"2025-10-13T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.177199 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cd
ca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.201824 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.221826 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.238590 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.251440 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.263789 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.272812 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.272850 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.272862 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.272882 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.272895 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:00Z","lastTransitionTime":"2025-10-13T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.275361 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.288267 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.299334 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.309720 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.320753 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.346395 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:57Z\\\",\\\"message\\\":\\\" 6328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1013 18:14:57.469292 6328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:14:57.469370 6328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:57.469413 6328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 
18:14:57.469423 6328 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:14:57.469441 6328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:57.469452 6328 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:14:57.469488 6328 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:57.469525 6328 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:14:57.469528 6328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:57.469553 6328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:57.469541 6328 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:57.469555 6328 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1013 18:14:57.469584 6328 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:57.469608 6328 factory.go:656] Stopping watch factory\\\\nI1013 18:14:57.469627 6328 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"message\\\":\\\"\\\\nI1013 18:14:59.254053 6446 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254192 6446 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254628 6446 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:59.254714 6446 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:59.254789 6446 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1013 18:14:59.254819 6446 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:59.254919 6446 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 18:14:59.254949 6446 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 18:14:59.254994 6446 factory.go:656] Stopping watch factory\\\\nI1013 18:14:59.255032 6446 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18:14:59.255087 6446 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:59.255144 6446 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:59.255236 6446 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:14:59.255152 6446 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:59.255153 6446 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.357730 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.371485 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.375720 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.375866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.375954 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.376022 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.376081 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:00Z","lastTransitionTime":"2025-10-13T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.383758 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.396580 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.399212 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb"] Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.399859 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.405777 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.406029 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.421047 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.437205 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.447067 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.462098 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665c
bca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.474768 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.478229 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.478279 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.478295 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.478318 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.478332 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:00Z","lastTransitionTime":"2025-10-13T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.496519 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.507101 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/744ca489-ade8-41c6-94da-0b2d51a9ca6b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.507220 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc2ds\" (UniqueName: \"kubernetes.io/projected/744ca489-ade8-41c6-94da-0b2d51a9ca6b-kube-api-access-nc2ds\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.507308 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/744ca489-ade8-41c6-94da-0b2d51a9ca6b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.507343 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/744ca489-ade8-41c6-94da-0b2d51a9ca6b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.514030 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.529841 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377e
cd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.548352 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.567749 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.578769 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c51
36fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.580740 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.580773 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.580784 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 
13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.580802 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.580813 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:00Z","lastTransitionTime":"2025-10-13T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.587789 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.608062 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/744ca489-ade8-41c6-94da-0b2d51a9ca6b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.608115 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" 
(UniqueName: \"kubernetes.io/secret/744ca489-ade8-41c6-94da-0b2d51a9ca6b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.608156 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/744ca489-ade8-41c6-94da-0b2d51a9ca6b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.608178 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc2ds\" (UniqueName: \"kubernetes.io/projected/744ca489-ade8-41c6-94da-0b2d51a9ca6b-kube-api-access-nc2ds\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.607962 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:57Z\\\",\\\"message\\\":\\\" 6328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1013 18:14:57.469292 6328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:14:57.469370 6328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:57.469413 6328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 
18:14:57.469423 6328 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:14:57.469441 6328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:57.469452 6328 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:14:57.469488 6328 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:57.469525 6328 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:14:57.469528 6328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:57.469553 6328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:57.469541 6328 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:57.469555 6328 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1013 18:14:57.469584 6328 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:57.469608 6328 factory.go:656] Stopping watch factory\\\\nI1013 18:14:57.469627 6328 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"message\\\":\\\"\\\\nI1013 18:14:59.254053 6446 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254192 6446 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254628 6446 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:59.254714 6446 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:59.254789 6446 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1013 18:14:59.254819 6446 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:59.254919 6446 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 18:14:59.254949 6446 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 18:14:59.254994 6446 factory.go:656] Stopping watch factory\\\\nI1013 18:14:59.255032 6446 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18:14:59.255087 6446 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:59.255144 6446 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:59.255236 6446 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:14:59.255152 6446 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:59.255153 6446 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.608633 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/744ca489-ade8-41c6-94da-0b2d51a9ca6b-env-overrides\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.608904 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/744ca489-ade8-41c6-94da-0b2d51a9ca6b-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.615617 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/744ca489-ade8-41c6-94da-0b2d51a9ca6b-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.619746 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.628239 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc2ds\" (UniqueName: \"kubernetes.io/projected/744ca489-ade8-41c6-94da-0b2d51a9ca6b-kube-api-access-nc2ds\") pod \"ovnkube-control-plane-749d76644c-rt4jb\" (UID: \"744ca489-ade8-41c6-94da-0b2d51a9ca6b\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.630890 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.641156 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.652924 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.674110 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3f
e4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.683478 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.683510 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.683520 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.683537 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.683548 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:00Z","lastTransitionTime":"2025-10-13T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.692206 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:00Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.718741 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" Oct 13 18:15:00 crc kubenswrapper[4974]: W1013 18:15:00.731476 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod744ca489_ade8_41c6_94da_0b2d51a9ca6b.slice/crio-c200a75f28c68843ab1d4e6e2c702188f1c53fe9f4d38acf9bdf6f208bed624b WatchSource:0}: Error finding container c200a75f28c68843ab1d4e6e2c702188f1c53fe9f4d38acf9bdf6f208bed624b: Status 404 returned error can't find the container with id c200a75f28c68843ab1d4e6e2c702188f1c53fe9f4d38acf9bdf6f208bed624b Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.786796 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.786859 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.786875 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.786901 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.786919 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:00Z","lastTransitionTime":"2025-10-13T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.888993 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.889036 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.889047 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.889065 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.889075 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:00Z","lastTransitionTime":"2025-10-13T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.991592 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.991665 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.991680 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.991695 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:00 crc kubenswrapper[4974]: I1013 18:15:00.992048 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:00Z","lastTransitionTime":"2025-10-13T18:15:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.095178 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.095204 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.095212 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.095224 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.095233 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:01Z","lastTransitionTime":"2025-10-13T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.111025 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" event={"ID":"744ca489-ade8-41c6-94da-0b2d51a9ca6b","Type":"ContainerStarted","Data":"11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.111062 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" event={"ID":"744ca489-ade8-41c6-94da-0b2d51a9ca6b","Type":"ContainerStarted","Data":"aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.111071 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" event={"ID":"744ca489-ade8-41c6-94da-0b2d51a9ca6b","Type":"ContainerStarted","Data":"c200a75f28c68843ab1d4e6e2c702188f1c53fe9f4d38acf9bdf6f208bed624b"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.112801 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/1.log" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.126740 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.139675 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.151393 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.171528 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:57Z\\\",\\\"message\\\":\\\" 6328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1013 18:14:57.469292 6328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:14:57.469370 6328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:57.469413 6328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 
18:14:57.469423 6328 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:14:57.469441 6328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:57.469452 6328 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:14:57.469488 6328 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:57.469525 6328 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:14:57.469528 6328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:57.469553 6328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:57.469541 6328 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:57.469555 6328 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1013 18:14:57.469584 6328 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:57.469608 6328 factory.go:656] Stopping watch factory\\\\nI1013 18:14:57.469627 6328 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"message\\\":\\\"\\\\nI1013 18:14:59.254053 6446 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254192 6446 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254628 6446 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:59.254714 6446 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:59.254789 6446 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1013 18:14:59.254819 6446 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:59.254919 6446 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 18:14:59.254949 6446 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 18:14:59.254994 6446 factory.go:656] Stopping watch factory\\\\nI1013 18:14:59.255032 6446 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18:14:59.255087 6446 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:59.255144 6446 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:59.255236 6446 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:14:59.255152 6446 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:59.255153 6446 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.188169 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.197747 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.197779 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.197787 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.197801 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.197811 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:01Z","lastTransitionTime":"2025-10-13T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.202952 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.213911 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.228555 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.249553 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e
5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.263340 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.276414 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.292591 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.300678 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.300716 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.300725 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.300742 
4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.300752 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:01Z","lastTransitionTime":"2025-10-13T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.307146 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.319638 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.334290 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:
45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be
35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.347153 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.403405 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.403471 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.403491 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 
18:15:01.403514 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.403535 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:01Z","lastTransitionTime":"2025-10-13T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.509945 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.510007 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.510034 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.510061 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.510101 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:01Z","lastTransitionTime":"2025-10-13T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.516709 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.517085 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:15:17.517005347 +0000 UTC m=+52.421371467 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.613972 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.614058 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.614084 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.614118 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:01 crc kubenswrapper[4974]: 
I1013 18:15:01.614140 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:01Z","lastTransitionTime":"2025-10-13T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.617677 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.617745 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.617788 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.617822 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.617864 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.617901 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.617922 4974 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.617963 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.617991 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.617998 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:17.61797518 +0000 UTC m=+52.522341290 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.617990 4974 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.618019 4974 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.618061 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:17.618045522 +0000 UTC m=+52.522411632 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.618063 4974 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.618106 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:17.618080553 +0000 UTC m=+52.522446673 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.618171 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:17.618136275 +0000 UTC m=+52.522502395 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.716869 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.716941 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.716967 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.717000 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.717025 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:01Z","lastTransitionTime":"2025-10-13T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.811330 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.811386 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.811330 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.811547 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.811767 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.811860 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.819149 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.819207 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.819234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.819263 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.819286 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:01Z","lastTransitionTime":"2025-10-13T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.902077 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-z9hj4"] Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.902629 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:01 crc kubenswrapper[4974]: E1013 18:15:01.902743 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.917931 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc
/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.921807 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.921859 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.921872 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.921893 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.921905 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:01Z","lastTransitionTime":"2025-10-13T18:15:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.953009 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:57Z\\\",\\\"message\\\":\\\" 6328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1013 18:14:57.469292 6328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:14:57.469370 6328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:57.469413 6328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 
18:14:57.469423 6328 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:14:57.469441 6328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:57.469452 6328 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:14:57.469488 6328 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:57.469525 6328 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:14:57.469528 6328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:57.469553 6328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:57.469541 6328 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:57.469555 6328 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1013 18:14:57.469584 6328 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:57.469608 6328 factory.go:656] Stopping watch factory\\\\nI1013 18:14:57.469627 6328 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"message\\\":\\\"\\\\nI1013 18:14:59.254053 6446 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254192 6446 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254628 6446 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:59.254714 6446 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:59.254789 6446 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1013 18:14:59.254819 6446 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:59.254919 6446 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 18:14:59.254949 6446 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 18:14:59.254994 6446 factory.go:656] Stopping watch factory\\\\nI1013 18:14:59.255032 6446 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18:14:59.255087 6446 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:59.255144 6446 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:59.255236 6446 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:14:59.255152 6446 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:59.255153 6446 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.972296 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:01 crc kubenswrapper[4974]: I1013 18:15:01.992626 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:01Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.011429 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.021805 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.021922 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7spln\" (UniqueName: \"kubernetes.io/projected/a260247c-2399-42b5-bddc-73e38659680b-kube-api-access-7spln\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.024471 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.024524 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.024542 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.024566 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.024584 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.031558 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.049414 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc 
kubenswrapper[4974]: I1013 18:15:02.075955 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.094915 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.111961 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.122993 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.123083 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7spln\" (UniqueName: \"kubernetes.io/projected/a260247c-2399-42b5-bddc-73e38659680b-kube-api-access-7spln\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:02 crc kubenswrapper[4974]: E1013 18:15:02.123269 4974 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:02 crc kubenswrapper[4974]: E1013 18:15:02.123588 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs podName:a260247c-2399-42b5-bddc-73e38659680b nodeName:}" failed. No retries permitted until 2025-10-13 18:15:02.623551066 +0000 UTC m=+37.527917216 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs") pod "network-metrics-daemon-z9hj4" (UID: "a260247c-2399-42b5-bddc-73e38659680b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.128077 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.128145 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.128169 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.128198 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.128220 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.132649 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cd
ca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.149504 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7spln\" (UniqueName: \"kubernetes.io/projected/a260247c-2399-42b5-bddc-73e38659680b-kube-api-access-7spln\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.157926 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.179011 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.194041 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.194083 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.194096 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.194113 4974 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.194124 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.202530 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f
71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: E1013 18:15:02.212145 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.217409 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.217451 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.217469 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.217492 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.217510 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.225041 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: E1013 18:15:02.238812 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.244303 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.244399 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.244427 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.244460 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.244489 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.245640 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: E1013 18:15:02.266491 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.272201 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.272241 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.272255 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.272277 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.272293 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.277169 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:
14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: E1013 18:15:02.294718 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.298811 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.298861 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.298878 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.298902 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.298923 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: E1013 18:15:02.319540 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:02Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:02 crc kubenswrapper[4974]: E1013 18:15:02.319720 4974 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.322582 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.322624 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.322638 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.322684 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.322702 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.427329 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.427381 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.427400 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.427454 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.427478 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.531288 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.531594 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.531762 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.531896 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.532019 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.628009 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:02 crc kubenswrapper[4974]: E1013 18:15:02.628281 4974 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:02 crc kubenswrapper[4974]: E1013 18:15:02.628404 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs podName:a260247c-2399-42b5-bddc-73e38659680b nodeName:}" failed. No retries permitted until 2025-10-13 18:15:03.628373971 +0000 UTC m=+38.532740091 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs") pod "network-metrics-daemon-z9hj4" (UID: "a260247c-2399-42b5-bddc-73e38659680b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.637685 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.637750 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.637774 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.637803 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.637825 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.741892 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.742385 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.742509 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.742638 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.742802 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.847217 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.847289 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.847314 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.847350 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.847372 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.951173 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.951225 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.951243 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.951267 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:02 crc kubenswrapper[4974]: I1013 18:15:02.951283 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:02Z","lastTransitionTime":"2025-10-13T18:15:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.054965 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.055548 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.055626 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.055749 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.055830 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:03Z","lastTransitionTime":"2025-10-13T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.159152 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.159209 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.159280 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.159307 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.159325 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:03Z","lastTransitionTime":"2025-10-13T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.263248 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.263312 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.263326 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.263349 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.263364 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:03Z","lastTransitionTime":"2025-10-13T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.367013 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.367080 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.367100 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.367127 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.367147 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:03Z","lastTransitionTime":"2025-10-13T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.470149 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.470206 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.470225 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.470252 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.470271 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:03Z","lastTransitionTime":"2025-10-13T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.573380 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.573463 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.573488 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.573516 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.573535 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:03Z","lastTransitionTime":"2025-10-13T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.638948 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:03 crc kubenswrapper[4974]: E1013 18:15:03.639196 4974 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:03 crc kubenswrapper[4974]: E1013 18:15:03.639298 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs podName:a260247c-2399-42b5-bddc-73e38659680b nodeName:}" failed. No retries permitted until 2025-10-13 18:15:05.639275206 +0000 UTC m=+40.543641296 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs") pod "network-metrics-daemon-z9hj4" (UID: "a260247c-2399-42b5-bddc-73e38659680b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.677508 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.677583 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.677606 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.677636 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.677694 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:03Z","lastTransitionTime":"2025-10-13T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.781289 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.781383 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.781402 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.781458 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.781476 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:03Z","lastTransitionTime":"2025-10-13T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.811715 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.811816 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.811838 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.811744 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:03 crc kubenswrapper[4974]: E1013 18:15:03.812007 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:03 crc kubenswrapper[4974]: E1013 18:15:03.812146 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:03 crc kubenswrapper[4974]: E1013 18:15:03.812311 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:03 crc kubenswrapper[4974]: E1013 18:15:03.812609 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.884434 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.884503 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.884522 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.884548 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.884566 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:03Z","lastTransitionTime":"2025-10-13T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.987436 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.987501 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.987518 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.987545 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:03 crc kubenswrapper[4974]: I1013 18:15:03.987562 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:03Z","lastTransitionTime":"2025-10-13T18:15:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.091291 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.091368 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.091386 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.091490 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.091520 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:04Z","lastTransitionTime":"2025-10-13T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.194314 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.194378 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.194396 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.194420 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.194439 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:04Z","lastTransitionTime":"2025-10-13T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.305026 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.305091 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.305109 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.305134 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.305153 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:04Z","lastTransitionTime":"2025-10-13T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.408831 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.408905 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.408930 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.408964 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.408986 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:04Z","lastTransitionTime":"2025-10-13T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.512555 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.512632 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.512682 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.512711 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.512729 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:04Z","lastTransitionTime":"2025-10-13T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.616006 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.616067 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.616086 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.616113 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.616132 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:04Z","lastTransitionTime":"2025-10-13T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.719367 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.719435 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.719455 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.719482 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.719503 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:04Z","lastTransitionTime":"2025-10-13T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.822737 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.822804 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.822822 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.822850 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.822871 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:04Z","lastTransitionTime":"2025-10-13T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.927005 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.927069 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.927086 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.927114 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:04 crc kubenswrapper[4974]: I1013 18:15:04.927134 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:04Z","lastTransitionTime":"2025-10-13T18:15:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.031285 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.031375 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.031397 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.031425 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.031445 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:05Z","lastTransitionTime":"2025-10-13T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.134943 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.135022 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.135048 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.135085 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.135107 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:05Z","lastTransitionTime":"2025-10-13T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.241783 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.241833 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.241845 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.241866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.241880 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:05Z","lastTransitionTime":"2025-10-13T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.344882 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.344957 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.344978 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.345012 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.345032 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:05Z","lastTransitionTime":"2025-10-13T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.449032 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.449084 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.449094 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.449111 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.449121 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:05Z","lastTransitionTime":"2025-10-13T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.553239 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.553459 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.553485 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.553522 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.553548 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:05Z","lastTransitionTime":"2025-10-13T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.656065 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.656139 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.656158 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.656188 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.656216 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:05Z","lastTransitionTime":"2025-10-13T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.693541 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:05 crc kubenswrapper[4974]: E1013 18:15:05.693832 4974 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:05 crc kubenswrapper[4974]: E1013 18:15:05.693934 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs podName:a260247c-2399-42b5-bddc-73e38659680b nodeName:}" failed. No retries permitted until 2025-10-13 18:15:09.693903987 +0000 UTC m=+44.598270107 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs") pod "network-metrics-daemon-z9hj4" (UID: "a260247c-2399-42b5-bddc-73e38659680b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.758816 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.758886 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.758908 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.758935 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.758955 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:05Z","lastTransitionTime":"2025-10-13T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.810895 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.810980 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.811088 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:05 crc kubenswrapper[4974]: E1013 18:15:05.811089 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.811133 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:05 crc kubenswrapper[4974]: E1013 18:15:05.811200 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:05 crc kubenswrapper[4974]: E1013 18:15:05.811353 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:05 crc kubenswrapper[4974]: E1013 18:15:05.811443 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.835467 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.858140 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.863332 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.863379 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.863397 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.863422 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.863442 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:05Z","lastTransitionTime":"2025-10-13T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.881733 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.906566 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.928570 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.949585 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.967161 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.967216 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.967228 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.967252 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.967268 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:05Z","lastTransitionTime":"2025-10-13T18:15:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:05 crc kubenswrapper[4974]: I1013 18:15:05.972715 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/et
c/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.005650 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdd4770e2bd9545cf7fa4ab742fddd355eabd4c31a76a1769529d90d07fcf111\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:57Z\\\",\\\"message\\\":\\\" 6328 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1013 18:14:57.469292 6328 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:14:57.469370 6328 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:57.469413 6328 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 
18:14:57.469423 6328 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:14:57.469441 6328 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:57.469452 6328 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:14:57.469488 6328 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:57.469525 6328 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:14:57.469528 6328 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:57.469553 6328 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:57.469541 6328 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:57.469555 6328 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1013 18:14:57.469584 6328 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:57.469608 6328 factory.go:656] Stopping watch factory\\\\nI1013 18:14:57.469627 6328 ovnkube.go:599] Stopped ovnkube\\\\nI10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"message\\\":\\\"\\\\nI1013 18:14:59.254053 6446 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254192 6446 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254628 6446 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:59.254714 6446 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:59.254789 6446 handler.go:190] Sending *v1.Node event handler 2 for 
removal\\\\nI1013 18:14:59.254819 6446 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:59.254919 6446 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 18:14:59.254949 6446 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 18:14:59.254994 6446 factory.go:656] Stopping watch factory\\\\nI1013 18:14:59.255032 6446 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18:14:59.255087 6446 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:59.255144 6446 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:59.255236 6446 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:14:59.255152 6446 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:59.255153 6446 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.029564 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.055212 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.070120 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.070201 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.070229 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.070258 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.070276 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:06Z","lastTransitionTime":"2025-10-13T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.075353 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-13T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.089801 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.106725 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.144815 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e
5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.166830 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.172894 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.172960 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.172972 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.172995 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.173037 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:06Z","lastTransitionTime":"2025-10-13T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.180917 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.200308 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.304805 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.304866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.304884 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.304917 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.304938 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:06Z","lastTransitionTime":"2025-10-13T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.407691 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.407770 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.407788 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.407822 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.407842 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:06Z","lastTransitionTime":"2025-10-13T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.510264 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.510324 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.510343 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.510369 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.510385 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:06Z","lastTransitionTime":"2025-10-13T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.614708 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.614780 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.614799 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.614826 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.614848 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:06Z","lastTransitionTime":"2025-10-13T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.718769 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.718836 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.718847 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.718877 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.718891 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:06Z","lastTransitionTime":"2025-10-13T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.822924 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.822982 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.823000 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.823025 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.823043 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:06Z","lastTransitionTime":"2025-10-13T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.928022 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.928115 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.928140 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.928174 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:06 crc kubenswrapper[4974]: I1013 18:15:06.928199 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:06Z","lastTransitionTime":"2025-10-13T18:15:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.031337 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.031400 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.031418 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.031444 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.031461 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:07Z","lastTransitionTime":"2025-10-13T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.135397 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.135457 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.135500 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.135525 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.135543 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:07Z","lastTransitionTime":"2025-10-13T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.238485 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.238562 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.238581 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.238610 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.238631 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:07Z","lastTransitionTime":"2025-10-13T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.342724 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.342791 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.342815 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.342848 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.342869 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:07Z","lastTransitionTime":"2025-10-13T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.446460 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.446538 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.446551 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.446579 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.446600 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:07Z","lastTransitionTime":"2025-10-13T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.550553 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.550841 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.550862 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.550891 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.550909 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:07Z","lastTransitionTime":"2025-10-13T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.654692 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.654797 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.654816 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.654851 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.654881 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:07Z","lastTransitionTime":"2025-10-13T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.758293 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.758367 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.758390 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.758422 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.758448 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:07Z","lastTransitionTime":"2025-10-13T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.811142 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.811196 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.811277 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:07 crc kubenswrapper[4974]: E1013 18:15:07.811311 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.811368 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:07 crc kubenswrapper[4974]: E1013 18:15:07.811604 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:07 crc kubenswrapper[4974]: E1013 18:15:07.811786 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:07 crc kubenswrapper[4974]: E1013 18:15:07.811889 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.862033 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.862090 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.862103 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.862125 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.862142 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:07Z","lastTransitionTime":"2025-10-13T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.965719 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.965776 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.965796 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.965820 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:07 crc kubenswrapper[4974]: I1013 18:15:07.965838 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:07Z","lastTransitionTime":"2025-10-13T18:15:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.069625 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.069735 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.069759 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.069797 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.069821 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:08Z","lastTransitionTime":"2025-10-13T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.172310 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.172374 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.172395 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.172422 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.172443 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:08Z","lastTransitionTime":"2025-10-13T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.277973 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.278030 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.278053 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.278087 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.278109 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:08Z","lastTransitionTime":"2025-10-13T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.381940 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.382020 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.382045 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.382075 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.382100 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:08Z","lastTransitionTime":"2025-10-13T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.484873 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.484932 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.484950 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.484974 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.484994 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:08Z","lastTransitionTime":"2025-10-13T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.588236 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.588278 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.588294 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.588314 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.588335 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:08Z","lastTransitionTime":"2025-10-13T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.691097 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.691152 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.691173 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.691197 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.691212 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:08Z","lastTransitionTime":"2025-10-13T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.794837 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.794963 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.794989 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.795020 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.795040 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:08Z","lastTransitionTime":"2025-10-13T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.898322 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.898380 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.898397 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.898421 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:08 crc kubenswrapper[4974]: I1013 18:15:08.898438 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:08Z","lastTransitionTime":"2025-10-13T18:15:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.002783 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.002886 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.002906 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.002939 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.002961 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:09Z","lastTransitionTime":"2025-10-13T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.106040 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.106199 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.106233 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.106267 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.106291 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:09Z","lastTransitionTime":"2025-10-13T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.209306 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.209368 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.209393 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.209425 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.209455 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:09Z","lastTransitionTime":"2025-10-13T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.312370 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.312432 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.312455 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.312484 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.312505 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:09Z","lastTransitionTime":"2025-10-13T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.415630 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.415725 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.415749 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.415816 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.415840 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:09Z","lastTransitionTime":"2025-10-13T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.519106 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.519166 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.519190 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.519219 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.519238 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:09Z","lastTransitionTime":"2025-10-13T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.623300 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.623387 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.623446 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.623484 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.623504 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:09Z","lastTransitionTime":"2025-10-13T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.726282 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.726352 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.726379 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.726410 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.726433 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:09Z","lastTransitionTime":"2025-10-13T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.746231 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:09 crc kubenswrapper[4974]: E1013 18:15:09.746477 4974 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:09 crc kubenswrapper[4974]: E1013 18:15:09.746581 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs podName:a260247c-2399-42b5-bddc-73e38659680b nodeName:}" failed. No retries permitted until 2025-10-13 18:15:17.746557198 +0000 UTC m=+52.650923308 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs") pod "network-metrics-daemon-z9hj4" (UID: "a260247c-2399-42b5-bddc-73e38659680b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.811260 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.811337 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.811413 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:09 crc kubenswrapper[4974]: E1013 18:15:09.811620 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:09 crc kubenswrapper[4974]: E1013 18:15:09.811857 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.812010 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:09 crc kubenswrapper[4974]: E1013 18:15:09.812019 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:09 crc kubenswrapper[4974]: E1013 18:15:09.812193 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.831038 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.831135 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.831152 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.831172 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.831220 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:09Z","lastTransitionTime":"2025-10-13T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.934547 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.934617 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.934641 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.934722 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:09 crc kubenswrapper[4974]: I1013 18:15:09.934749 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:09Z","lastTransitionTime":"2025-10-13T18:15:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.037750 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.038292 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.038422 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.038550 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.038801 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:10Z","lastTransitionTime":"2025-10-13T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.141009 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.141071 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.141087 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.141104 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.141116 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:10Z","lastTransitionTime":"2025-10-13T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.244864 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.244930 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.244953 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.244981 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.245003 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:10Z","lastTransitionTime":"2025-10-13T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.348639 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.348993 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.349190 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.349365 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.349577 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:10Z","lastTransitionTime":"2025-10-13T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.453228 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.453294 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.453313 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.453339 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.453357 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:10Z","lastTransitionTime":"2025-10-13T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.561964 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.562029 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.562710 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.562753 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.562774 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:10Z","lastTransitionTime":"2025-10-13T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.665759 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.665808 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.665826 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.665849 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.665868 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:10Z","lastTransitionTime":"2025-10-13T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.769397 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.769865 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.770092 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.770261 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.770406 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:10Z","lastTransitionTime":"2025-10-13T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.874340 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.874423 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.874443 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.874475 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.874497 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:10Z","lastTransitionTime":"2025-10-13T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.978439 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.978761 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.978940 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.979093 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:10 crc kubenswrapper[4974]: I1013 18:15:10.979236 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:10Z","lastTransitionTime":"2025-10-13T18:15:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.082236 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.082305 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.082322 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.082349 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.082368 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:11Z","lastTransitionTime":"2025-10-13T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.185110 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.185171 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.185190 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.185215 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.185234 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:11Z","lastTransitionTime":"2025-10-13T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.287819 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.287900 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.287917 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.287944 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.287961 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:11Z","lastTransitionTime":"2025-10-13T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.390786 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.390847 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.390865 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.390891 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.390912 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:11Z","lastTransitionTime":"2025-10-13T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.493934 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.494000 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.494023 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.494051 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.494072 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:11Z","lastTransitionTime":"2025-10-13T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.597245 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.597334 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.597353 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.597377 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.597394 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:11Z","lastTransitionTime":"2025-10-13T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.700692 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.700750 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.700768 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.700794 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.700812 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:11Z","lastTransitionTime":"2025-10-13T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.803905 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.803969 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.803992 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.804022 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.804045 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:11Z","lastTransitionTime":"2025-10-13T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.811303 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.811322 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.811378 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.811399 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:11 crc kubenswrapper[4974]: E1013 18:15:11.812292 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:11 crc kubenswrapper[4974]: E1013 18:15:11.812401 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.812587 4974 scope.go:117] "RemoveContainer" containerID="78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5" Oct 13 18:15:11 crc kubenswrapper[4974]: E1013 18:15:11.812593 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:11 crc kubenswrapper[4974]: E1013 18:15:11.812794 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.836896 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"i
mageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259
7126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:11Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.854724 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:11Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.877632 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:11Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.897253 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-13T18:15:11Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.908277 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.908750 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.908768 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.908792 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.908808 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:11Z","lastTransitionTime":"2025-10-13T18:15:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.924763 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:11Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.941184 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:11Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.952829 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:11Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:11 crc kubenswrapper[4974]: I1013 18:15:11.977578 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"message\\\":\\\"\\\\nI1013 18:14:59.254053 6446 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254192 6446 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254628 6446 handler.go:190] Sending *v1.EgressIP event handler 
8 for removal\\\\nI1013 18:14:59.254714 6446 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:59.254789 6446 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:59.254819 6446 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:59.254919 6446 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 18:14:59.254949 6446 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 18:14:59.254994 6446 factory.go:656] Stopping watch factory\\\\nI1013 18:14:59.255032 6446 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18:14:59.255087 6446 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:59.255144 6446 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:59.255236 6446 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:14:59.255152 6446 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:59.255153 6446 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:11Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.002005 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:11Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.012441 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.012477 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.012491 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.012510 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.012525 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.019178 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.030390 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.043224 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.057549 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc 
kubenswrapper[4974]: I1013 18:15:12.081249 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.100629 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.115042 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.115123 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.115135 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.115152 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.115187 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.118457 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.133713 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.160914 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/1.log" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.163144 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21"} Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.163248 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.178218 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b62756
3f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.188951 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc 
kubenswrapper[4974]: I1013 18:15:12.208489 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.217990 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.218055 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.218079 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.218115 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.218140 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.222365 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.233566 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.251334 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.268112 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.285334 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.308194 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.320870 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.320927 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.320943 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.320968 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.320982 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.326125 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.344496 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.375205 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.391068 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.415996 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"message\\\":\\\"\\\\nI1013 18:14:59.254053 6446 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254192 6446 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254628 6446 handler.go:190] Sending *v1.EgressIP event handler 
8 for removal\\\\nI1013 18:14:59.254714 6446 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:59.254789 6446 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:59.254819 6446 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:59.254919 6446 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 18:14:59.254949 6446 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 18:14:59.254994 6446 factory.go:656] Stopping watch factory\\\\nI1013 18:14:59.255032 6446 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18:14:59.255087 6446 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:59.255144 6446 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:59.255236 6446 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:14:59.255152 6446 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:59.255153 6446 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"n
ame\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.427280 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.427359 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.427383 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.427525 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.427671 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.435905 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.454000 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.468103 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.477735 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.477782 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.477794 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc 
kubenswrapper[4974]: I1013 18:15:12.477818 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.477831 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: E1013 18:15:12.495906 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has 
sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff54
7002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cd
fc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"nam
es\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.500678 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.500725 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.500737 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.500763 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.500776 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: E1013 18:15:12.518937 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.523563 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.523612 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.523624 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.523669 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.523687 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: E1013 18:15:12.538361 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.542438 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.542476 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.542485 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.542507 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.542520 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: E1013 18:15:12.555625 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.561045 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.561117 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.561126 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.561145 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.561156 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: E1013 18:15:12.591072 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:12Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:12 crc kubenswrapper[4974]: E1013 18:15:12.591231 4974 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.592949 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.593012 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.593027 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.593050 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.593064 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.694933 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.694959 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.694969 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.694983 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.694993 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.797350 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.797411 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.797434 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.797460 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.797477 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.900282 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.900329 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.900341 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.900356 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:12 crc kubenswrapper[4974]: I1013 18:15:12.900367 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:12Z","lastTransitionTime":"2025-10-13T18:15:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.002423 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.002466 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.002479 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.002496 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.002528 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:13Z","lastTransitionTime":"2025-10-13T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.105307 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.105335 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.105344 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.105356 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.105365 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:13Z","lastTransitionTime":"2025-10-13T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.168199 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/2.log" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.169066 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/1.log" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.172451 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21" exitCode=1 Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.172513 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.172576 4974 scope.go:117] "RemoveContainer" containerID="78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.173824 4974 scope.go:117] "RemoveContainer" containerID="d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21" Oct 13 18:15:13 crc kubenswrapper[4974]: E1013 18:15:13.174148 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.195056 4974 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.210333 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.210611 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.210722 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.210809 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.210885 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:13Z","lastTransitionTime":"2025-10-13T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.211681 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.223962 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.246157 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78511d72fe58c059708228089e4e6b317180543ddcc161462f71cbc71b7d98b5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"message\\\":\\\"\\\\nI1013 18:14:59.254053 6446 reflector.go:311] Stopping reflector *v1.Node (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254192 6446 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1013 18:14:59.254628 6446 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1013 18:14:59.254714 6446 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:14:59.254789 6446 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:14:59.254819 6446 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:14:59.254919 6446 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1013 18:14:59.254949 6446 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1013 18:14:59.254994 6446 factory.go:656] Stopping watch factory\\\\nI1013 18:14:59.255032 6446 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18:14:59.255087 6446 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1013 18:14:59.255144 6446 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:14:59.255236 6446 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:14:59.255152 6446 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:14:59.255153 6446 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubernetes/ovnkube-node-zwcs8 after 0 failed attempt(s)\\\\nI1013 18:15:12.935117 6665 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-zwcs8\\\\nI1013 18:15:12.935120 6665 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-image-registry/node-ca-98z75\\\\nI1013 18:15:12.935110 6665 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI1013 18:15:12.935135 6665 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nF1013 18:15:12.935141 6665 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
cer\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.266023 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.284874 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.302395 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.313838 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.314006 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.314083 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.314283 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.314400 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:13Z","lastTransitionTime":"2025-10-13T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.319456 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.335749 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc 
kubenswrapper[4974]: I1013 18:15:13.371296 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.392493 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.402371 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.415816 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.420423 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.420489 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.420508 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.420534 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.420552 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:13Z","lastTransitionTime":"2025-10-13T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.437171 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cd
ca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.454819 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.475851 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.501508 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\
\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.519279 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:13Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.523802 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.523837 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:13 crc 
kubenswrapper[4974]: I1013 18:15:13.523848 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.523872 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.523882 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:13Z","lastTransitionTime":"2025-10-13T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.626140 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.626530 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.626617 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.626750 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.626832 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:13Z","lastTransitionTime":"2025-10-13T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.728882 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.729335 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.729738 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.729940 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.730326 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:13Z","lastTransitionTime":"2025-10-13T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.811591 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.812058 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:13 crc kubenswrapper[4974]: E1013 18:15:13.812317 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.812351 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:13 crc kubenswrapper[4974]: E1013 18:15:13.812519 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:13 crc kubenswrapper[4974]: E1013 18:15:13.812703 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.812383 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:13 crc kubenswrapper[4974]: E1013 18:15:13.812969 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.833849 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.833895 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.833905 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.833924 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.833938 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:13Z","lastTransitionTime":"2025-10-13T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.937681 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.938142 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.938833 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.939090 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:13 crc kubenswrapper[4974]: I1013 18:15:13.939300 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:13Z","lastTransitionTime":"2025-10-13T18:15:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.046363 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.046443 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.046466 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.046493 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.046512 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:14Z","lastTransitionTime":"2025-10-13T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.149669 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.150084 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.150147 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.150222 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.150280 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:14Z","lastTransitionTime":"2025-10-13T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.178943 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/2.log" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.184769 4974 scope.go:117] "RemoveContainer" containerID="d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21" Oct 13 18:15:14 crc kubenswrapper[4974]: E1013 18:15:14.185043 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.205478 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377e
cd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.219209 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.239036 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.256394 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:14 crc 
kubenswrapper[4974]: I1013 18:15:14.256456 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.256476 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.256509 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.256532 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:14Z","lastTransitionTime":"2025-10-13T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.258385 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.270984 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.282258 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.297008 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.326585 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubernetes/ovnkube-node-zwcs8 after 0 failed attempt(s)\\\\nI1013 18:15:12.935117 6665 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-zwcs8\\\\nI1013 18:15:12.935120 6665 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nI1013 18:15:12.935110 6665 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI1013 18:15:12.935135 6665 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nF1013 18:15:12.935141 6665 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: cer\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.360623 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.360677 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.360686 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.360702 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.360712 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:14Z","lastTransitionTime":"2025-10-13T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.362449 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.379602 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.392487 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.405419 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.415163 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc 
kubenswrapper[4974]: I1013 18:15:14.430874 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.452095 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.463587 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.463638 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.463664 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.463682 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.463693 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:14Z","lastTransitionTime":"2025-10-13T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.470635 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.486380 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.566424 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.566471 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.566481 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.566499 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.566513 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:14Z","lastTransitionTime":"2025-10-13T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.669741 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.669785 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.669796 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.669813 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.669825 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:14Z","lastTransitionTime":"2025-10-13T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.772821 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.772900 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.772922 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.772952 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.772970 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:14Z","lastTransitionTime":"2025-10-13T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.875760 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.875876 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.875892 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.875922 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.875940 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:14Z","lastTransitionTime":"2025-10-13T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.979726 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.979783 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.979802 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.979826 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:14 crc kubenswrapper[4974]: I1013 18:15:14.979848 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:14Z","lastTransitionTime":"2025-10-13T18:15:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.082432 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.082492 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.082511 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.082534 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.082552 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:15Z","lastTransitionTime":"2025-10-13T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.185363 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.185421 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.185440 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.185463 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.185483 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:15Z","lastTransitionTime":"2025-10-13T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.288850 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.288918 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.288955 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.288987 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.289012 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:15Z","lastTransitionTime":"2025-10-13T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.392051 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.392117 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.392141 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.392172 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.392192 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:15Z","lastTransitionTime":"2025-10-13T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.495591 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.495690 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.495718 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.495751 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.495774 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:15Z","lastTransitionTime":"2025-10-13T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.599121 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.599197 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.599215 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.599243 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.599260 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:15Z","lastTransitionTime":"2025-10-13T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.701672 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.701730 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.701739 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.701759 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.701771 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:15Z","lastTransitionTime":"2025-10-13T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.804888 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.804987 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.805005 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.805031 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.805047 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:15Z","lastTransitionTime":"2025-10-13T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.811362 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.811426 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.811494 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:15 crc kubenswrapper[4974]: E1013 18:15:15.811733 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.811808 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:15 crc kubenswrapper[4974]: E1013 18:15:15.812017 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:15 crc kubenswrapper[4974]: E1013 18:15:15.812081 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:15 crc kubenswrapper[4974]: E1013 18:15:15.812606 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.834417 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:15Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.859592 4974 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contai
nerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\
",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":tr
ue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\
\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:15Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.879128 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:15Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.900033 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:15Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.907105 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.907178 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.907199 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.907226 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.907245 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:15Z","lastTransitionTime":"2025-10-13T18:15:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.918585 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:15Z 
is after 2025-08-24T17:21:41Z" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.942367 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"
}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c98711
7ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:15Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.964505 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:15Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:15 crc kubenswrapper[4974]: I1013 18:15:15.991741 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:15Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.009469 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:16Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.012213 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.012251 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.012259 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:16 crc 
kubenswrapper[4974]: I1013 18:15:16.012278 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.012291 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:16Z","lastTransitionTime":"2025-10-13T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.024092 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:16Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.057118 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubernetes/ovnkube-node-zwcs8 after 0 failed attempt(s)\\\\nI1013 18:15:12.935117 6665 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-zwcs8\\\\nI1013 18:15:12.935120 6665 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nI1013 18:15:12.935110 6665 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI1013 18:15:12.935135 6665 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nF1013 18:15:12.935141 6665 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: cer\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:16Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.079110 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:16Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.097055 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:16Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.115487 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.115617 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.115640 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.115714 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.115731 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:16Z","lastTransitionTime":"2025-10-13T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.116241 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:16Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.135505 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b62756
3f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:16Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.157026 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:16Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:16 crc 
kubenswrapper[4974]: I1013 18:15:16.192184 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:16Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.218543 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.218601 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.218619 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.218642 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.218685 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:16Z","lastTransitionTime":"2025-10-13T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.322228 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.322306 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.322328 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.322422 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.322466 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:16Z","lastTransitionTime":"2025-10-13T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.425521 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.425605 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.425627 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.425691 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.425716 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:16Z","lastTransitionTime":"2025-10-13T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.529119 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.529181 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.529203 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.529233 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.529254 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:16Z","lastTransitionTime":"2025-10-13T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.632684 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.632839 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.632873 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.632903 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.632925 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:16Z","lastTransitionTime":"2025-10-13T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.735214 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.735288 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.735314 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.735343 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.735364 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:16Z","lastTransitionTime":"2025-10-13T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.838292 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.838345 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.838362 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.838385 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.838403 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:16Z","lastTransitionTime":"2025-10-13T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.941376 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.941553 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.941572 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.941598 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:16 crc kubenswrapper[4974]: I1013 18:15:16.941618 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:16Z","lastTransitionTime":"2025-10-13T18:15:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.044941 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.045017 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.045035 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.045064 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.045086 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:17Z","lastTransitionTime":"2025-10-13T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.148283 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.148342 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.148359 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.148382 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.148400 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:17Z","lastTransitionTime":"2025-10-13T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.251462 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.251589 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.251622 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.251694 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.251719 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:17Z","lastTransitionTime":"2025-10-13T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.354034 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.354101 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.354118 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.354146 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.354167 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:17Z","lastTransitionTime":"2025-10-13T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.457849 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.457921 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.457938 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.457962 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.457982 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:17Z","lastTransitionTime":"2025-10-13T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.535938 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.536347 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-13 18:15:49.53629508 +0000 UTC m=+84.440661200 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.560721 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.560777 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.560794 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.560818 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.560836 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:17Z","lastTransitionTime":"2025-10-13T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.638320 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.638398 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.638444 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.638480 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638622 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" 
not registered Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638688 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638703 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638704 4974 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638730 4974 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638760 4974 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638718 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638809 4974 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:17 crc 
kubenswrapper[4974]: E1013 18:15:17.638839 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:49.638791966 +0000 UTC m=+84.543158086 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638869 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:49.638856507 +0000 UTC m=+84.543222627 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638890 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:49.638879668 +0000 UTC m=+84.543245778 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.638917 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 18:15:49.638906269 +0000 UTC m=+84.543272379 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.663690 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.663752 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.663770 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.663795 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.663812 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:17Z","lastTransitionTime":"2025-10-13T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.767121 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.767182 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.767205 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.767236 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.767255 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:17Z","lastTransitionTime":"2025-10-13T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.810627 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.810729 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.810835 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.811214 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.811246 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.811827 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.811381 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.811690 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.840825 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.840994 4974 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:17 crc kubenswrapper[4974]: E1013 18:15:17.841075 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs podName:a260247c-2399-42b5-bddc-73e38659680b nodeName:}" failed. No retries permitted until 2025-10-13 18:15:33.841053561 +0000 UTC m=+68.745419681 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs") pod "network-metrics-daemon-z9hj4" (UID: "a260247c-2399-42b5-bddc-73e38659680b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.870549 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.870612 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.870637 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.870699 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.870720 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:17Z","lastTransitionTime":"2025-10-13T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.973473 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.973536 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.973562 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.973592 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:17 crc kubenswrapper[4974]: I1013 18:15:17.973614 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:17Z","lastTransitionTime":"2025-10-13T18:15:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.091938 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.092348 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.092374 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.092397 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.092414 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:18Z","lastTransitionTime":"2025-10-13T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.195569 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.195625 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.195639 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.195682 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.195697 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:18Z","lastTransitionTime":"2025-10-13T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.298937 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.298999 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.299049 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.299077 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.299101 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:18Z","lastTransitionTime":"2025-10-13T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.401614 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.401994 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.402251 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.402457 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.402683 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:18Z","lastTransitionTime":"2025-10-13T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.506486 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.506530 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.506541 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.506559 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.506570 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:18Z","lastTransitionTime":"2025-10-13T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.612270 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.612335 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.612355 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.612380 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.612399 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:18Z","lastTransitionTime":"2025-10-13T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.715639 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.716837 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.717111 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.717318 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.717482 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:18Z","lastTransitionTime":"2025-10-13T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.821288 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.821734 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.821760 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.821791 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.821817 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:18Z","lastTransitionTime":"2025-10-13T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.925527 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.925587 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.925603 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.925628 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:18 crc kubenswrapper[4974]: I1013 18:15:18.925644 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:18Z","lastTransitionTime":"2025-10-13T18:15:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.029008 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.029050 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.029060 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.029075 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.029085 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:19Z","lastTransitionTime":"2025-10-13T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.131469 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.131529 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.131547 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.131573 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.131594 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:19Z","lastTransitionTime":"2025-10-13T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.234306 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.234380 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.234398 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.234423 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.234442 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:19Z","lastTransitionTime":"2025-10-13T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.338156 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.338217 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.338235 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.338266 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.338284 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:19Z","lastTransitionTime":"2025-10-13T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.442097 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.442155 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.442165 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.442188 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.442201 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:19Z","lastTransitionTime":"2025-10-13T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.545605 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.545678 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.545691 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.545712 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.545727 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:19Z","lastTransitionTime":"2025-10-13T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.649271 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.649360 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.649373 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.649391 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.649402 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:19Z","lastTransitionTime":"2025-10-13T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.753080 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.753154 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.753189 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.753222 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.753244 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:19Z","lastTransitionTime":"2025-10-13T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.811068 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.811033 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.811383 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:19 crc kubenswrapper[4974]: E1013 18:15:19.811376 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.811290 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:19 crc kubenswrapper[4974]: E1013 18:15:19.811498 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:19 crc kubenswrapper[4974]: E1013 18:15:19.811746 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:19 crc kubenswrapper[4974]: E1013 18:15:19.811844 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.857585 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.858092 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.858219 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.858345 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.858455 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:19Z","lastTransitionTime":"2025-10-13T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.961452 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.961838 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.961953 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.962108 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:19 crc kubenswrapper[4974]: I1013 18:15:19.962306 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:19Z","lastTransitionTime":"2025-10-13T18:15:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.066212 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.066287 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.066316 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.066347 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.066369 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:20Z","lastTransitionTime":"2025-10-13T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.169898 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.169942 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.169955 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.169971 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.169982 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:20Z","lastTransitionTime":"2025-10-13T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.294194 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.294269 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.294290 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.294326 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.294353 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:20Z","lastTransitionTime":"2025-10-13T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.397640 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.397756 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.397783 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.397819 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.397846 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:20Z","lastTransitionTime":"2025-10-13T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.501732 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.501792 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.501809 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.501836 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.501855 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:20Z","lastTransitionTime":"2025-10-13T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.604730 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.604787 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.604806 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.604833 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.604851 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:20Z","lastTransitionTime":"2025-10-13T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.707530 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.707590 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.707607 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.707629 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.707647 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:20Z","lastTransitionTime":"2025-10-13T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.810890 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.810952 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.810969 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.810996 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.811013 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:20Z","lastTransitionTime":"2025-10-13T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.913500 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.913529 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.913537 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.913549 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:20 crc kubenswrapper[4974]: I1013 18:15:20.913558 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:20Z","lastTransitionTime":"2025-10-13T18:15:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.016027 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.016084 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.016100 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.016121 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.016138 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:21Z","lastTransitionTime":"2025-10-13T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.119215 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.119286 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.119304 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.119332 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.119350 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:21Z","lastTransitionTime":"2025-10-13T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.223393 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.223475 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.223494 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.223522 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.223542 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:21Z","lastTransitionTime":"2025-10-13T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.243806 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.258633 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.267590 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.288878 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.308818 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.326504 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:21 crc 
kubenswrapper[4974]: I1013 18:15:21.326567 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.326587 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.326625 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.326643 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:21Z","lastTransitionTime":"2025-10-13T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.326835 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.343973 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.395400 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubernetes/ovnkube-node-zwcs8 after 0 failed attempt(s)\\\\nI1013 18:15:12.935117 6665 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-zwcs8\\\\nI1013 18:15:12.935120 6665 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nI1013 18:15:12.935110 6665 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI1013 18:15:12.935135 6665 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nF1013 18:15:12.935141 6665 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: cer\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.413048 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.429528 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.429597 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.429623 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.429684 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.429710 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:21Z","lastTransitionTime":"2025-10-13T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.433363 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.450497 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.468791 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.484647 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc 
kubenswrapper[4974]: I1013 18:15:21.518461 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.533061 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.533136 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.533158 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.533189 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.533213 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:21Z","lastTransitionTime":"2025-10-13T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.540876 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.561041 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.578431 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.599115 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.620011 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:21Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.636258 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.636295 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.636307 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.636324 4974 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.636339 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:21Z","lastTransitionTime":"2025-10-13T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.740390 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.740452 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.740470 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.740496 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.740515 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:21Z","lastTransitionTime":"2025-10-13T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.811329 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.811366 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.811411 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:21 crc kubenswrapper[4974]: E1013 18:15:21.811513 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.811570 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:21 crc kubenswrapper[4974]: E1013 18:15:21.811713 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:21 crc kubenswrapper[4974]: E1013 18:15:21.811828 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:21 crc kubenswrapper[4974]: E1013 18:15:21.812026 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.843229 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.843289 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.843307 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.843330 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.843347 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:21Z","lastTransitionTime":"2025-10-13T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.949934 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.950070 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.950096 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.950126 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:21 crc kubenswrapper[4974]: I1013 18:15:21.950148 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:21Z","lastTransitionTime":"2025-10-13T18:15:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.053016 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.053080 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.053100 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.053126 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.053144 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.157085 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.157146 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.157164 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.157195 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.157219 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.261101 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.261172 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.261225 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.261253 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.261277 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.364378 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.364434 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.364452 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.364475 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.364493 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.468152 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.468218 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.468249 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.468278 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.468297 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.570934 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.571014 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.571039 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.571071 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.571097 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.673872 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.673936 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.673954 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.673982 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.674003 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.774781 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.774842 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.774860 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.774885 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.774903 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: E1013 18:15:22.795968 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:22Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.801854 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.801909 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.801929 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.801955 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.801974 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: E1013 18:15:22.822581 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:22Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.827763 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.827821 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.827838 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.827863 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.827883 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: E1013 18:15:22.844631 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:22Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.850459 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.850705 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.850897 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.851024 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.851169 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: E1013 18:15:22.871414 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:22Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.876286 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.876501 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.876647 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.876830 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.876971 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:22 crc kubenswrapper[4974]: E1013 18:15:22.899016 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:22Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:22 crc kubenswrapper[4974]: E1013 18:15:22.899241 4974 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.901606 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.901695 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.901715 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.901741 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:22 crc kubenswrapper[4974]: I1013 18:15:22.901760 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:22Z","lastTransitionTime":"2025-10-13T18:15:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.004839 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.005259 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.005531 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.005713 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.005879 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:23Z","lastTransitionTime":"2025-10-13T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.109329 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.109384 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.109401 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.109429 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.109445 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:23Z","lastTransitionTime":"2025-10-13T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.212309 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.212746 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.212914 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.213075 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.213229 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:23Z","lastTransitionTime":"2025-10-13T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.317606 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.317700 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.317714 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.317740 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.317758 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:23Z","lastTransitionTime":"2025-10-13T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.421344 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.421420 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.421447 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.421480 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.421504 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:23Z","lastTransitionTime":"2025-10-13T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.526179 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.526252 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.526275 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.526304 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.526328 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:23Z","lastTransitionTime":"2025-10-13T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.629642 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.629736 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.629760 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.629790 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.629814 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:23Z","lastTransitionTime":"2025-10-13T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.732612 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.732705 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.732724 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.732749 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.732765 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:23Z","lastTransitionTime":"2025-10-13T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.811552 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.811616 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.811811 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:23 crc kubenswrapper[4974]: E1013 18:15:23.811777 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.811893 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:23 crc kubenswrapper[4974]: E1013 18:15:23.811933 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:23 crc kubenswrapper[4974]: E1013 18:15:23.812124 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:23 crc kubenswrapper[4974]: E1013 18:15:23.812261 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.836286 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.836378 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.836405 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.836482 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.836511 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:23Z","lastTransitionTime":"2025-10-13T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.966919 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.966977 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.966994 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.967018 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:23 crc kubenswrapper[4974]: I1013 18:15:23.967034 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:23Z","lastTransitionTime":"2025-10-13T18:15:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.069883 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.069952 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.069976 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.070006 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.070029 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:24Z","lastTransitionTime":"2025-10-13T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.173539 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.173592 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.173610 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.173636 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.173690 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:24Z","lastTransitionTime":"2025-10-13T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.276501 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.276568 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.276590 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.276619 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.276644 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:24Z","lastTransitionTime":"2025-10-13T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.379530 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.379593 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.379617 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.379647 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.379717 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:24Z","lastTransitionTime":"2025-10-13T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.482750 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.482804 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.482822 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.482850 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.482868 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:24Z","lastTransitionTime":"2025-10-13T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.585318 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.585375 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.585387 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.585408 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.585421 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:24Z","lastTransitionTime":"2025-10-13T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.688744 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.688804 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.688829 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.688854 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.688871 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:24Z","lastTransitionTime":"2025-10-13T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.792374 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.792436 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.792455 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.792481 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.792500 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:24Z","lastTransitionTime":"2025-10-13T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.894847 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.894893 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.895002 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.895094 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.895114 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:24Z","lastTransitionTime":"2025-10-13T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.998212 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.998274 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.998291 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.998314 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:24 crc kubenswrapper[4974]: I1013 18:15:24.998337 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:24Z","lastTransitionTime":"2025-10-13T18:15:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.101349 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.101419 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.101440 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.101481 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.101502 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:25Z","lastTransitionTime":"2025-10-13T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.204481 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.204564 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.204587 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.204621 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.204645 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:25Z","lastTransitionTime":"2025-10-13T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.308019 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.308078 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.308094 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.308117 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.308135 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:25Z","lastTransitionTime":"2025-10-13T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.410953 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.411011 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.411028 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.411055 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.411073 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:25Z","lastTransitionTime":"2025-10-13T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.513151 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.513245 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.513269 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.513301 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.513327 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:25Z","lastTransitionTime":"2025-10-13T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.616046 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.616133 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.616170 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.616201 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.616226 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:25Z","lastTransitionTime":"2025-10-13T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.718803 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.718863 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.718881 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.718904 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.718921 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:25Z","lastTransitionTime":"2025-10-13T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.810856 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.810866 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.811001 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:25 crc kubenswrapper[4974]: E1013 18:15:25.811250 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:25 crc kubenswrapper[4974]: E1013 18:15:25.811643 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:25 crc kubenswrapper[4974]: E1013 18:15:25.811889 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.811707 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:25 crc kubenswrapper[4974]: E1013 18:15:25.812471 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.821953 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.822012 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.822029 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.822051 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.822068 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:25Z","lastTransitionTime":"2025-10-13T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.834954 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cd
ca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:25Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.857995 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:25Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.879126 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:25Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.905821 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:25Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.924681 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.924731 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.924748 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.924772 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.924792 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:25Z","lastTransitionTime":"2025-10-13T18:15:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.929541 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:25Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.950874 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:25Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.975986 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:25Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:25 crc kubenswrapper[4974]: I1013 18:15:25.999178 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec4e67d-0eda-4b41-90f7-ef0371f5ec49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:25Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.019293 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:26Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.028832 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.028899 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.028924 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.028956 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.028982 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:26Z","lastTransitionTime":"2025-10-13T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.039742 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:26Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.057606 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:26Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.073549 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:26Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.105734 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubernetes/ovnkube-node-zwcs8 after 0 failed attempt(s)\\\\nI1013 18:15:12.935117 6665 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-zwcs8\\\\nI1013 18:15:12.935120 6665 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nI1013 18:15:12.935110 6665 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI1013 18:15:12.935135 6665 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nF1013 18:15:12.935141 6665 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: cer\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:26Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.132109 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.132153 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.132171 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.132195 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.132212 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:26Z","lastTransitionTime":"2025-10-13T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.137035 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:26Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.155165 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:26Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.170087 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:26Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.187298 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:26Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.204394 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:26Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:26 crc 
kubenswrapper[4974]: I1013 18:15:26.234541 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.234591 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.234609 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.234631 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.234647 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:26Z","lastTransitionTime":"2025-10-13T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.337246 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.337310 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.337334 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.337361 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.337383 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:26Z","lastTransitionTime":"2025-10-13T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.440316 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.440384 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.440402 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.440427 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.440448 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:26Z","lastTransitionTime":"2025-10-13T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.544051 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.544114 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.544132 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.544156 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.544173 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:26Z","lastTransitionTime":"2025-10-13T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.646945 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.647014 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.647032 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.647059 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.647077 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:26Z","lastTransitionTime":"2025-10-13T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.750390 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.750455 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.750473 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.750503 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.750524 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:26Z","lastTransitionTime":"2025-10-13T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.853979 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.854057 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.854077 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.854102 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.854121 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:26Z","lastTransitionTime":"2025-10-13T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.957271 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.957344 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.957368 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.957392 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:26 crc kubenswrapper[4974]: I1013 18:15:26.957410 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:26Z","lastTransitionTime":"2025-10-13T18:15:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.060129 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.060181 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.060200 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.060229 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.060247 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:27Z","lastTransitionTime":"2025-10-13T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.163575 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.163648 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.163706 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.163733 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.163756 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:27Z","lastTransitionTime":"2025-10-13T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.266257 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.266333 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.266355 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.266383 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.266406 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:27Z","lastTransitionTime":"2025-10-13T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.368913 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.368980 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.368998 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.369024 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.369042 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:27Z","lastTransitionTime":"2025-10-13T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.472476 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.472560 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.472588 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.472618 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.472643 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:27Z","lastTransitionTime":"2025-10-13T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.576078 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.576202 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.576227 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.576257 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.576279 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:27Z","lastTransitionTime":"2025-10-13T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.679348 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.679406 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.679424 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.679448 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.679467 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:27Z","lastTransitionTime":"2025-10-13T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.782928 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.783018 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.783035 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.783060 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.783081 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:27Z","lastTransitionTime":"2025-10-13T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.815708 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:27 crc kubenswrapper[4974]: E1013 18:15:27.815873 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.816157 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:27 crc kubenswrapper[4974]: E1013 18:15:27.816258 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.816460 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:27 crc kubenswrapper[4974]: E1013 18:15:27.816554 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.817100 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:27 crc kubenswrapper[4974]: E1013 18:15:27.817213 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.886768 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.886825 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.886841 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.886866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.886882 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:27Z","lastTransitionTime":"2025-10-13T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.989583 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.989646 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.989692 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.989716 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:27 crc kubenswrapper[4974]: I1013 18:15:27.989733 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:27Z","lastTransitionTime":"2025-10-13T18:15:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.093451 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.093513 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.093529 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.093553 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.093571 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:28Z","lastTransitionTime":"2025-10-13T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.197711 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.197883 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.197937 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.197965 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.198019 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:28Z","lastTransitionTime":"2025-10-13T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.300598 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.300679 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.300700 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.300727 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.300745 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:28Z","lastTransitionTime":"2025-10-13T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.403624 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.403739 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.403755 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.403780 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.403796 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:28Z","lastTransitionTime":"2025-10-13T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.507174 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.507248 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.507275 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.507306 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.507328 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:28Z","lastTransitionTime":"2025-10-13T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.611024 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.611104 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.611126 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.611153 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.611172 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:28Z","lastTransitionTime":"2025-10-13T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.714459 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.714542 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.714564 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.714593 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.714610 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:28Z","lastTransitionTime":"2025-10-13T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.817271 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.817335 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.817358 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.817385 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.817402 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:28Z","lastTransitionTime":"2025-10-13T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.921204 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.921623 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.921649 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.921713 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:28 crc kubenswrapper[4974]: I1013 18:15:28.921737 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:28Z","lastTransitionTime":"2025-10-13T18:15:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.024913 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.024980 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.024999 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.025024 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.025042 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:29Z","lastTransitionTime":"2025-10-13T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.127251 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.127302 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.127323 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.127341 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.127355 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:29Z","lastTransitionTime":"2025-10-13T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.229998 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.230068 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.230087 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.230112 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.230130 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:29Z","lastTransitionTime":"2025-10-13T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.333097 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.333141 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.333153 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.333173 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.333185 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:29Z","lastTransitionTime":"2025-10-13T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.435941 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.436031 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.436056 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.436090 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.436115 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:29Z","lastTransitionTime":"2025-10-13T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.540766 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.540845 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.540870 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.540901 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.540937 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:29Z","lastTransitionTime":"2025-10-13T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.643974 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.644040 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.644059 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.644084 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.644102 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:29Z","lastTransitionTime":"2025-10-13T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.747023 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.747082 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.747102 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.747124 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.747141 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:29Z","lastTransitionTime":"2025-10-13T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.810876 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.810883 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.811023 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:29 crc kubenswrapper[4974]: E1013 18:15:29.811212 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.811314 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:29 crc kubenswrapper[4974]: E1013 18:15:29.811419 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:29 crc kubenswrapper[4974]: E1013 18:15:29.811541 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:29 crc kubenswrapper[4974]: E1013 18:15:29.812197 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.812824 4974 scope.go:117] "RemoveContainer" containerID="d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21" Oct 13 18:15:29 crc kubenswrapper[4974]: E1013 18:15:29.813260 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.849952 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.850016 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.850040 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.850068 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.850091 4974 setters.go:603] 
"Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:29Z","lastTransitionTime":"2025-10-13T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.953294 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.953332 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.953341 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.953359 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:29 crc kubenswrapper[4974]: I1013 18:15:29.953367 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:29Z","lastTransitionTime":"2025-10-13T18:15:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.056386 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.056467 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.056493 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.056524 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.056547 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:30Z","lastTransitionTime":"2025-10-13T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.159066 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.159114 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.159127 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.159143 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.159157 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:30Z","lastTransitionTime":"2025-10-13T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.261677 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.261718 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.261727 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.261746 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.261757 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:30Z","lastTransitionTime":"2025-10-13T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.393556 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.393607 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.393621 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.393642 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.393673 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:30Z","lastTransitionTime":"2025-10-13T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.496367 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.496428 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.496442 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.496463 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.496477 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:30Z","lastTransitionTime":"2025-10-13T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.599471 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.599537 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.599547 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.599569 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.599581 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:30Z","lastTransitionTime":"2025-10-13T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.702826 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.702913 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.702925 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.702951 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.702970 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:30Z","lastTransitionTime":"2025-10-13T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.806176 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.806254 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.806279 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.806343 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.806373 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:30Z","lastTransitionTime":"2025-10-13T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.908783 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.908815 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.908829 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.908848 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:30 crc kubenswrapper[4974]: I1013 18:15:30.908862 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:30Z","lastTransitionTime":"2025-10-13T18:15:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.012063 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.012102 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.012112 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.012128 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.012139 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:31Z","lastTransitionTime":"2025-10-13T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.114364 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.114428 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.114448 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.114477 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.114497 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:31Z","lastTransitionTime":"2025-10-13T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.217072 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.217107 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.217119 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.217135 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.217145 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:31Z","lastTransitionTime":"2025-10-13T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.319195 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.319257 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.319276 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.319308 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.319328 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:31Z","lastTransitionTime":"2025-10-13T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.422154 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.422208 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.422220 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.422238 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.422250 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:31Z","lastTransitionTime":"2025-10-13T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.524932 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.524960 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.524969 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.524983 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.524992 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:31Z","lastTransitionTime":"2025-10-13T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.628301 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.628341 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.628349 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.628363 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.628374 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:31Z","lastTransitionTime":"2025-10-13T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.731048 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.731081 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.731091 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.731106 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.731118 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:31Z","lastTransitionTime":"2025-10-13T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.811483 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.811565 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.811495 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.811623 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:31 crc kubenswrapper[4974]: E1013 18:15:31.811749 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:31 crc kubenswrapper[4974]: E1013 18:15:31.811870 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:31 crc kubenswrapper[4974]: E1013 18:15:31.811935 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:31 crc kubenswrapper[4974]: E1013 18:15:31.811999 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.833580 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.833637 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.833678 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.833700 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.833720 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:31Z","lastTransitionTime":"2025-10-13T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.936048 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.936089 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.936101 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.936155 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:31 crc kubenswrapper[4974]: I1013 18:15:31.936169 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:31Z","lastTransitionTime":"2025-10-13T18:15:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.039262 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.039324 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.039335 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.039352 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.039365 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.142254 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.142315 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.142332 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.142357 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.142377 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.244884 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.244954 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.244965 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.244985 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.244997 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.348058 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.348119 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.348144 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.348176 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.348198 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.449591 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.449693 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.449713 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.449739 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.449759 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.552231 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.552275 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.552286 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.552303 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.552314 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.655227 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.655279 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.655292 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.655312 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.655325 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.757843 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.757893 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.757911 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.757932 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.757948 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.861314 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.861361 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.861377 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.861398 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.861414 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.913619 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.913719 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.913738 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.913766 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.913785 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: E1013 18:15:32.932234 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.936782 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.936856 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.936874 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.936899 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.936928 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: E1013 18:15:32.950053 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.954764 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.954799 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.954810 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.954826 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.954837 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.972683 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.972717 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.972726 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.972739 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.972748 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:32 crc kubenswrapper[4974]: E1013 18:15:32.989322 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:32Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.993433 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.993462 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.993503 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.993524 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:32 crc kubenswrapper[4974]: I1013 18:15:32.993534 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:32Z","lastTransitionTime":"2025-10-13T18:15:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:33 crc kubenswrapper[4974]: E1013 18:15:33.010066 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:33Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:33 crc kubenswrapper[4974]: E1013 18:15:33.010277 4974 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.013646 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.013699 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.013710 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.013727 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.013741 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:33Z","lastTransitionTime":"2025-10-13T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.116908 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.116966 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.116984 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.117008 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.117025 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:33Z","lastTransitionTime":"2025-10-13T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.219637 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.219703 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.219715 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.219731 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.219742 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:33Z","lastTransitionTime":"2025-10-13T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.323358 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.323421 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.323439 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.323467 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.323484 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:33Z","lastTransitionTime":"2025-10-13T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.425827 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.425869 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.425879 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.425894 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.425903 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:33Z","lastTransitionTime":"2025-10-13T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.528173 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.528243 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.528268 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.528298 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.528319 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:33Z","lastTransitionTime":"2025-10-13T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.630913 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.630979 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.631004 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.631029 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.631047 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:33Z","lastTransitionTime":"2025-10-13T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.735446 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.735498 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.735516 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.735540 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.735558 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:33Z","lastTransitionTime":"2025-10-13T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.811126 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.811157 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.811320 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:33 crc kubenswrapper[4974]: E1013 18:15:33.811317 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:33 crc kubenswrapper[4974]: E1013 18:15:33.811510 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:33 crc kubenswrapper[4974]: E1013 18:15:33.811643 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.811862 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:33 crc kubenswrapper[4974]: E1013 18:15:33.811997 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.838790 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.838847 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.838866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.838892 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.838909 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:33Z","lastTransitionTime":"2025-10-13T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.929931 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:33 crc kubenswrapper[4974]: E1013 18:15:33.930097 4974 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:33 crc kubenswrapper[4974]: E1013 18:15:33.930164 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs podName:a260247c-2399-42b5-bddc-73e38659680b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:05.930146691 +0000 UTC m=+100.834512781 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs") pod "network-metrics-daemon-z9hj4" (UID: "a260247c-2399-42b5-bddc-73e38659680b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.941839 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.941882 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.941895 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.941914 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:33 crc kubenswrapper[4974]: I1013 18:15:33.941928 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:33Z","lastTransitionTime":"2025-10-13T18:15:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.044097 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.044135 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.044143 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.044158 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.044167 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:34Z","lastTransitionTime":"2025-10-13T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.147079 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.147127 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.147138 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.147157 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.147169 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:34Z","lastTransitionTime":"2025-10-13T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.250432 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.250476 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.250489 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.250507 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.250518 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:34Z","lastTransitionTime":"2025-10-13T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.353668 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.353725 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.353739 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.353762 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.353776 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:34Z","lastTransitionTime":"2025-10-13T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.458041 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.458132 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.458154 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.458180 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.458199 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:34Z","lastTransitionTime":"2025-10-13T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.561523 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.561588 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.561601 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.561626 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.561696 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:34Z","lastTransitionTime":"2025-10-13T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.665076 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.665180 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.665211 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.665254 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.665282 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:34Z","lastTransitionTime":"2025-10-13T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.768027 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.768072 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.768085 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.768102 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.768115 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:34Z","lastTransitionTime":"2025-10-13T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.870414 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.870453 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.870466 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.870483 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.870497 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:34Z","lastTransitionTime":"2025-10-13T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.973307 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.973375 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.973392 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.973420 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:34 crc kubenswrapper[4974]: I1013 18:15:34.973437 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:34Z","lastTransitionTime":"2025-10-13T18:15:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.077795 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.077864 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.077881 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.077906 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.077924 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:35Z","lastTransitionTime":"2025-10-13T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.180417 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.180470 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.180488 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.180512 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.180529 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:35Z","lastTransitionTime":"2025-10-13T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.270799 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/0.log" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.270871 4974 generic.go:334] "Generic (PLEG): container finished" podID="9c38c0e3-9bee-402b-adf0-27ac9e31c0f0" containerID="e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490" exitCode=1 Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.270959 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xcspx" event={"ID":"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0","Type":"ContainerDied","Data":"e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.271642 4974 scope.go:117] "RemoveContainer" containerID="e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.284301 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.284346 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.284360 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.284378 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.284388 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:35Z","lastTransitionTime":"2025-10-13T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.286547 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.305956 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.320528 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f
2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:
14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.333397 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.351126 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.367625 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"2025-10-13T18:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706\\\\n2025-10-13T18:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706 to /host/opt/cni/bin/\\\\n2025-10-13T18:14:50Z [verbose] multus-daemon started\\\\n2025-10-13T18:14:50Z [verbose] Readiness Indicator file check\\\\n2025-10-13T18:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.379084 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"in
itContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.386968 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.387026 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.387039 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:35 
crc kubenswrapper[4974]: I1013 18:15:35.387064 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.387077 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:35Z","lastTransitionTime":"2025-10-13T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.395032 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.408667 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.422152 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.431752 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.451050 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubernetes/ovnkube-node-zwcs8 after 0 failed attempt(s)\\\\nI1013 18:15:12.935117 6665 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-zwcs8\\\\nI1013 18:15:12.935120 6665 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nI1013 18:15:12.935110 6665 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI1013 18:15:12.935135 6665 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nF1013 18:15:12.935141 6665 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: cer\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.463361 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec4e67d-0eda-4b41-90f7-ef0371f5ec49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.485602 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.489286 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.489321 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:35 crc kubenswrapper[4974]: 
I1013 18:15:35.489333 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.489349 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.489361 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:35Z","lastTransitionTime":"2025-10-13T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.497495 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.506365 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.516769 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.528226 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc 
kubenswrapper[4974]: I1013 18:15:35.592016 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.592044 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.592052 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.592065 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.592074 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:35Z","lastTransitionTime":"2025-10-13T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.694574 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.694606 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.694615 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.694631 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.694641 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:35Z","lastTransitionTime":"2025-10-13T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.796915 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.796951 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.796968 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.796992 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.797004 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:35Z","lastTransitionTime":"2025-10-13T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.810671 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:35 crc kubenswrapper[4974]: E1013 18:15:35.810772 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.810874 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.810966 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:35 crc kubenswrapper[4974]: E1013 18:15:35.811019 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:35 crc kubenswrapper[4974]: E1013 18:15:35.811116 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.810873 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:35 crc kubenswrapper[4974]: E1013 18:15:35.811232 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.827561 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fde
e88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resource
s\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.848942 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.866664 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.883485 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.896865 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.898981 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.899187 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.899363 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.899495 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.899755 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:35Z","lastTransitionTime":"2025-10-13T18:15:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.924629 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.944245 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"2025-10-13T18:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706\\\\n2025-10-13T18:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706 to /host/opt/cni/bin/\\\\n2025-10-13T18:14:50Z [verbose] multus-daemon started\\\\n2025-10-13T18:14:50Z [verbose] Readiness Indicator file check\\\\n2025-10-13T18:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.971719 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubernetes/ovnkube-node-zwcs8 after 0 failed attempt(s)\\\\nI1013 18:15:12.935117 6665 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-zwcs8\\\\nI1013 18:15:12.935120 6665 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nI1013 18:15:12.935110 6665 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI1013 18:15:12.935135 6665 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nF1013 18:15:12.935141 6665 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: cer\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:35 crc kubenswrapper[4974]: I1013 18:15:35.986229 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec4e67d-0eda-4b41-90f7-ef0371f5ec49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.000942 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:35Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.002275 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.002383 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.002446 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.002515 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.002580 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:36Z","lastTransitionTime":"2025-10-13T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.013864 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.023864 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.033143 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.042747 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc 
kubenswrapper[4974]: I1013 18:15:36.060056 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.073639 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.086882 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.102833 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.105110 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.105160 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.105175 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.105193 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.105206 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:36Z","lastTransitionTime":"2025-10-13T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.207890 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.207957 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.207977 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.208023 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.208053 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:36Z","lastTransitionTime":"2025-10-13T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.276089 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/0.log" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.276168 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xcspx" event={"ID":"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0","Type":"ContainerStarted","Data":"e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94"} Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.294821 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.307460 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc 
kubenswrapper[4974]: I1013 18:15:36.316474 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.316569 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.317078 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.317169 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.317454 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:36Z","lastTransitionTime":"2025-10-13T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.337053 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.359263 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.376576 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.396512 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.411314 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.420356 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.420395 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.420412 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.420434 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.420452 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:36Z","lastTransitionTime":"2025-10-13T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.426456 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.442722 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.460386 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.476049 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.499813 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"2025-10-13T18:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706\\\\n2025-10-13T18:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706 to /host/opt/cni/bin/\\\\n2025-10-13T18:14:50Z [verbose] multus-daemon started\\\\n2025-10-13T18:14:50Z [verbose] 
Readiness Indicator file check\\\\n2025-10-13T18:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.514457 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c
936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.523070 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.523130 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.523148 4974 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.523172 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.523190 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:36Z","lastTransitionTime":"2025-10-13T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.536406 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubernetes/ovnkube-node-zwcs8 after 0 failed attempt(s)\\\\nI1013 18:15:12.935117 6665 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-zwcs8\\\\nI1013 18:15:12.935120 6665 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nI1013 18:15:12.935110 6665 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI1013 18:15:12.935135 6665 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nF1013 18:15:12.935141 6665 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: cer\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.554055 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec4e67d-0eda-4b41-90f7-ef0371f5ec49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.570871 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.588355 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.624824 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.624882 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.624905 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.624927 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.624944 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:36Z","lastTransitionTime":"2025-10-13T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.627980 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-13T18:15:36Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.727254 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.727305 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.727322 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.727345 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.727361 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:36Z","lastTransitionTime":"2025-10-13T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.829963 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.829996 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.830008 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.830021 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.830032 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:36Z","lastTransitionTime":"2025-10-13T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.932820 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.932913 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.932938 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.932967 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:36 crc kubenswrapper[4974]: I1013 18:15:36.932991 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:36Z","lastTransitionTime":"2025-10-13T18:15:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.034763 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.034805 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.034818 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.034834 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.034846 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:37Z","lastTransitionTime":"2025-10-13T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.137221 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.137291 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.137312 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.137340 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.137363 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:37Z","lastTransitionTime":"2025-10-13T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.240558 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.240598 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.240608 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.240623 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.240634 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:37Z","lastTransitionTime":"2025-10-13T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.343758 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.343831 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.343848 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.344347 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.344432 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:37Z","lastTransitionTime":"2025-10-13T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.447549 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.447582 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.447592 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.447608 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.447618 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:37Z","lastTransitionTime":"2025-10-13T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.550614 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.550691 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.550716 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.550745 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.550767 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:37Z","lastTransitionTime":"2025-10-13T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.653806 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.653869 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.653888 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.653912 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.653931 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:37Z","lastTransitionTime":"2025-10-13T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.756891 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.756933 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.756945 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.756960 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.756972 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:37Z","lastTransitionTime":"2025-10-13T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.811211 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.811254 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.811303 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:37 crc kubenswrapper[4974]: E1013 18:15:37.811562 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:37 crc kubenswrapper[4974]: E1013 18:15:37.811754 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:37 crc kubenswrapper[4974]: E1013 18:15:37.812013 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.811378 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:37 crc kubenswrapper[4974]: E1013 18:15:37.812612 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.859234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.859272 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.859283 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.859299 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.859311 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:37Z","lastTransitionTime":"2025-10-13T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.962983 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.963059 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.963082 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.963113 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:37 crc kubenswrapper[4974]: I1013 18:15:37.963153 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:37Z","lastTransitionTime":"2025-10-13T18:15:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.065521 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.065580 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.065597 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.065621 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.065638 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:38Z","lastTransitionTime":"2025-10-13T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.168017 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.168062 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.168078 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.168102 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.168118 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:38Z","lastTransitionTime":"2025-10-13T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.274962 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.275017 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.275033 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.275057 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.275074 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:38Z","lastTransitionTime":"2025-10-13T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.378342 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.378384 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.378399 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.378420 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.378436 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:38Z","lastTransitionTime":"2025-10-13T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.480536 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.480579 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.480595 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.480617 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.480634 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:38Z","lastTransitionTime":"2025-10-13T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.582753 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.582815 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.582835 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.582861 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.582881 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:38Z","lastTransitionTime":"2025-10-13T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.686024 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.686082 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.686099 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.686122 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.686141 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:38Z","lastTransitionTime":"2025-10-13T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.789296 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.789356 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.789374 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.789398 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.789415 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:38Z","lastTransitionTime":"2025-10-13T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.891628 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.891707 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.891724 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.891747 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.891767 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:38Z","lastTransitionTime":"2025-10-13T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.993778 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.993835 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.993853 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.993881 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:38 crc kubenswrapper[4974]: I1013 18:15:38.993900 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:38Z","lastTransitionTime":"2025-10-13T18:15:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.096163 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.096417 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.096433 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.096446 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.096455 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:39Z","lastTransitionTime":"2025-10-13T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.199821 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.199871 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.199892 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.199915 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.199932 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:39Z","lastTransitionTime":"2025-10-13T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.302284 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.302372 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.302388 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.302410 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.302428 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:39Z","lastTransitionTime":"2025-10-13T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.405470 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.405513 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.405530 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.405550 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.405567 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:39Z","lastTransitionTime":"2025-10-13T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.509021 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.509085 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.509110 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.509139 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.509159 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:39Z","lastTransitionTime":"2025-10-13T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.612488 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.612534 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.612551 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.612577 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.612594 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:39Z","lastTransitionTime":"2025-10-13T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.715248 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.715311 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.715334 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.715363 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.715385 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:39Z","lastTransitionTime":"2025-10-13T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.810779 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:39 crc kubenswrapper[4974]: E1013 18:15:39.810970 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.811075 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:39 crc kubenswrapper[4974]: E1013 18:15:39.811131 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.811170 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:39 crc kubenswrapper[4974]: E1013 18:15:39.811211 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.811249 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:39 crc kubenswrapper[4974]: E1013 18:15:39.811283 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.816948 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.817019 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.817042 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.817422 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.817768 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:39Z","lastTransitionTime":"2025-10-13T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.920758 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.920810 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.920827 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.920849 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:39 crc kubenswrapper[4974]: I1013 18:15:39.920865 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:39Z","lastTransitionTime":"2025-10-13T18:15:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.023297 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.023361 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.023384 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.023412 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.023433 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:40Z","lastTransitionTime":"2025-10-13T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.126387 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.126437 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.126455 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.126477 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.126498 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:40Z","lastTransitionTime":"2025-10-13T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.230186 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.230234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.230253 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.230276 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.230294 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:40Z","lastTransitionTime":"2025-10-13T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.333164 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.333215 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.333232 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.333257 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.333274 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:40Z","lastTransitionTime":"2025-10-13T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.436426 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.436474 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.436490 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.436512 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.436530 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:40Z","lastTransitionTime":"2025-10-13T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.540318 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.540371 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.540389 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.540412 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.540431 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:40Z","lastTransitionTime":"2025-10-13T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.643070 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.643116 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.643132 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.643155 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.643173 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:40Z","lastTransitionTime":"2025-10-13T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.745765 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.745814 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.745830 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.745853 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.745869 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:40Z","lastTransitionTime":"2025-10-13T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.813284 4974 scope.go:117] "RemoveContainer" containerID="d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.848387 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.848440 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.848458 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.848481 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.848499 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:40Z","lastTransitionTime":"2025-10-13T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.951521 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.951814 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.951961 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.952159 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:40 crc kubenswrapper[4974]: I1013 18:15:40.952317 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:40Z","lastTransitionTime":"2025-10-13T18:15:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.056130 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.056184 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.056209 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.056241 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.056265 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:41Z","lastTransitionTime":"2025-10-13T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.159890 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.159940 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.159955 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.159976 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.159994 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:41Z","lastTransitionTime":"2025-10-13T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.263927 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.264001 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.264018 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.264045 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.264062 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:41Z","lastTransitionTime":"2025-10-13T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.295554 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/2.log" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.300211 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.301006 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.324427 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 
18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.341408 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.360033 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"2025-10-13T18:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706\\\\n2025-10-13T18:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706 to /host/opt/cni/bin/\\\\n2025-10-13T18:14:50Z [verbose] multus-daemon started\\\\n2025-10-13T18:14:50Z [verbose] 
Readiness Indicator file check\\\\n2025-10-13T18:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.368177 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.368231 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.368249 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.368275 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.368293 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:41Z","lastTransitionTime":"2025-10-13T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.373937 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.398909 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubernetes/ovnkube-node-zwcs8 after 0 failed attempt(s)\\\\nI1013 18:15:12.935117 6665 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-zwcs8\\\\nI1013 18:15:12.935120 6665 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nI1013 
18:15:12.935110 6665 loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI1013 18:15:12.935135 6665 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nF1013 18:15:12.935141 6665 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
cer\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\
\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.420906 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec4e67d-0eda-4b41-90f7-ef0371f5ec49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.443810 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.468989 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.471027 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.471083 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.471100 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.471127 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.471144 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:41Z","lastTransitionTime":"2025-10-13T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.492297 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.564286 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.574172 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.574220 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.574237 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.574257 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.574273 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:41Z","lastTransitionTime":"2025-10-13T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.582719 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.621345 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3f
e4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.635211 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.644320 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.655574 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.667223 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.676447 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.676471 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.676478 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.676492 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.676500 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:41Z","lastTransitionTime":"2025-10-13T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.682230 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.695168 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:41Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.778401 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.778446 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.778457 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.778472 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.778484 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:41Z","lastTransitionTime":"2025-10-13T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.811215 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.811248 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.811283 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.811216 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:41 crc kubenswrapper[4974]: E1013 18:15:41.811332 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:41 crc kubenswrapper[4974]: E1013 18:15:41.811436 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:41 crc kubenswrapper[4974]: E1013 18:15:41.811730 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:41 crc kubenswrapper[4974]: E1013 18:15:41.811707 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.881418 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.881459 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.881473 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.881489 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.881502 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:41Z","lastTransitionTime":"2025-10-13T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.984203 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.984246 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.984256 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.984272 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:41 crc kubenswrapper[4974]: I1013 18:15:41.984283 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:41Z","lastTransitionTime":"2025-10-13T18:15:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.086776 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.086823 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.086834 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.086855 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.086867 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:42Z","lastTransitionTime":"2025-10-13T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.189888 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.189955 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.189978 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.190007 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.190028 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:42Z","lastTransitionTime":"2025-10-13T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.293344 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.293416 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.293435 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.293461 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.293480 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:42Z","lastTransitionTime":"2025-10-13T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.306051 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/3.log" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.307355 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/2.log" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.311354 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" exitCode=1 Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.311419 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f"} Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.311482 4974 scope.go:117] "RemoveContainer" containerID="d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.312848 4974 scope.go:117] "RemoveContainer" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:15:42 crc kubenswrapper[4974]: E1013 18:15:42.315406 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.336734 4974 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5
cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.356049 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.376786 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"2025-10-13T18:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706\\\\n2025-10-13T18:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706 to /host/opt/cni/bin/\\\\n2025-10-13T18:14:50Z [verbose] multus-daemon started\\\\n2025-10-13T18:14:50Z [verbose] 
Readiness Indicator file check\\\\n2025-10-13T18:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.391817 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c
936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.395968 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.396015 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.396032 4974 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.396058 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.396076 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:42Z","lastTransitionTime":"2025-10-13T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.422909 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d377be5f40e9975cb7ae6b14fe096d81416e6fdcb5babab4d301f15adaf2ae21\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:12Z\\\",\\\"message\\\":\\\"kubernetes/ovnkube-node-zwcs8 after 0 failed attempt(s)\\\\nI1013 18:15:12.935117 6665 default_network_controller.go:776] Recording success event on pod openshift-ovn-kubernetes/ovnkube-node-zwcs8\\\\nI1013 18:15:12.935120 6665 obj_retry.go:303] Retry object setup: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nI1013 18:15:12.935110 6665 
loadbalancer.go:304] Deleted 0 stale LBs for map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-network-console/networking-console-plugin\\\\\\\"}\\\\nI1013 18:15:12.935135 6665 obj_retry.go:365] Adding new object: *v1.Pod openshift-image-registry/node-ca-98z75\\\\nF1013 18:15:12.935141 6665 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: cer\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:11Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:42Z\\\",\\\"message\\\":\\\"s/externalversions/factory.go:140\\\\nI1013 18:15:42.028539 7039 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 18:15:42.029006 7039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 18:15:42.029033 7039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:15:42.029082 7039 handler.go:190] Sending 
*v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:15:42.029083 7039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:15:42.029108 7039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:15:42.029127 7039 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:15:42.029135 7039 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:15:42.029155 7039 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:15:42.029166 7039 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:15:42.029202 7039 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 18:15:42.029219 7039 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:15:42.029256 7039 factory.go:656] Stopping watch factory\\\\nI1013 18:15:42.029276 7039 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/
etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secre
ts/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.443877 4974 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec4e67d-0eda-4b41-90f7-ef0371f5ec49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd
789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.463011 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.484169 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.498510 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.498555 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.498572 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.498595 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.498611 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:42Z","lastTransitionTime":"2025-10-13T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.502124 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.520194 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.536841 4974 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc 
kubenswrapper[4974]: I1013 18:15:42.567134 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.590279 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.602159 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.602241 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.602267 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:42 crc 
kubenswrapper[4974]: I1013 18:15:42.602301 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.602326 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:42Z","lastTransitionTime":"2025-10-13T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.608034 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.629402 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.652466 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.681852 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.706414 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.706536 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.706562 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.706597 4974 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.706621 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:42Z","lastTransitionTime":"2025-10-13T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.706699 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f
71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:42Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.810076 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.810170 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.810199 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.810234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.810258 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:42Z","lastTransitionTime":"2025-10-13T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.913722 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.913784 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.913804 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.913830 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:42 crc kubenswrapper[4974]: I1013 18:15:42.913849 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:42Z","lastTransitionTime":"2025-10-13T18:15:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.017081 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.017144 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.017164 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.017190 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.017209 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.120433 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.120504 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.120516 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.120541 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.120556 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.144174 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.144249 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.144269 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.144298 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.144320 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.166855 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.172237 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.172307 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.172332 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.172365 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.172389 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.195796 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.201876 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.201957 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.201993 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.202023 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.202037 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.228159 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.232972 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.233047 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.233069 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.233095 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.233113 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.249838 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.255722 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.255771 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.255784 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.255805 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.255819 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.276083 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.276311 4974 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.278358 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.278450 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.278466 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.278485 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.278497 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.318150 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/3.log" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.323592 4974 scope.go:117] "RemoveContainer" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.323961 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.339737 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.361774 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.382913 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.383011 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.383082 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.383098 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.383120 4974 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.383133 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.402824 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\
\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{
\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f
71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.425139 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377e
cd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.443537 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.464397 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"2025-10-13T18:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706\\\\n2025-10-13T18:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706 to /host/opt/cni/bin/\\\\n2025-10-13T18:14:50Z [verbose] multus-daemon started\\\\n2025-10-13T18:14:50Z [verbose] 
Readiness Indicator file check\\\\n2025-10-13T18:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.483060 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec4e67d-0eda-4b41-90f7-ef0371f5ec49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.487168 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc 
kubenswrapper[4974]: I1013 18:15:43.487207 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.487223 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.487249 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.487269 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.500132 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.518514 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.533692 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.550263 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.582235 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:42Z\\\",\\\"message\\\":\\\"s/externalversions/factory.go:140\\\\nI1013 18:15:42.028539 7039 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 18:15:42.029006 7039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 
18:15:42.029033 7039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:15:42.029082 7039 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:15:42.029083 7039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:15:42.029108 7039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:15:42.029127 7039 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:15:42.029135 7039 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:15:42.029155 7039 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:15:42.029166 7039 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:15:42.029202 7039 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 18:15:42.029219 7039 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:15:42.029256 7039 factory.go:656] Stopping watch factory\\\\nI1013 18:15:42.029276 7039 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.595441 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.595495 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.595514 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.595539 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.595557 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.621892 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.642114 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.656791 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.676010 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.693088 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:43Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:43 crc 
kubenswrapper[4974]: I1013 18:15:43.698204 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.698268 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.698286 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.698313 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.698330 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.801248 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.801319 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.801338 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.801364 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.801385 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.811758 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.811912 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.811911 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.812041 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.811918 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.812077 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.812146 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:43 crc kubenswrapper[4974]: E1013 18:15:43.812360 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.905289 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.905366 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.905387 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.905413 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:43 crc kubenswrapper[4974]: I1013 18:15:43.905432 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:43Z","lastTransitionTime":"2025-10-13T18:15:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.008707 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.008767 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.008784 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.008810 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.008827 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:44Z","lastTransitionTime":"2025-10-13T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.112331 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.112393 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.112410 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.112436 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.112454 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:44Z","lastTransitionTime":"2025-10-13T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.215627 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.215721 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.215740 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.215767 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.215785 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:44Z","lastTransitionTime":"2025-10-13T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.317835 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.317872 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.317881 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.317894 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.317905 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:44Z","lastTransitionTime":"2025-10-13T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.421000 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.421086 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.421107 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.421137 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.421190 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:44Z","lastTransitionTime":"2025-10-13T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.524772 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.524820 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.524837 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.524865 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.524883 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:44Z","lastTransitionTime":"2025-10-13T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.628669 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.628735 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.628748 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.628772 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.628787 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:44Z","lastTransitionTime":"2025-10-13T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.731722 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.731799 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.731817 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.731846 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.731864 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:44Z","lastTransitionTime":"2025-10-13T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.835302 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.835362 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.835410 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.835437 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.835454 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:44Z","lastTransitionTime":"2025-10-13T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.938991 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.939076 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.939097 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.939123 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:44 crc kubenswrapper[4974]: I1013 18:15:44.939141 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:44Z","lastTransitionTime":"2025-10-13T18:15:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.042090 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.042170 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.042188 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.042214 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.042238 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:45Z","lastTransitionTime":"2025-10-13T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.145832 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.145909 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.145926 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.145954 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.145972 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:45Z","lastTransitionTime":"2025-10-13T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.249301 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.249377 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.249399 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.249424 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.249484 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:45Z","lastTransitionTime":"2025-10-13T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.352545 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.352609 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.352631 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.352684 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.352702 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:45Z","lastTransitionTime":"2025-10-13T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.455894 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.456026 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.456044 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.456069 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.456118 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:45Z","lastTransitionTime":"2025-10-13T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.559751 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.559846 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.559863 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.559919 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.559936 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:45Z","lastTransitionTime":"2025-10-13T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.663003 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.663059 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.663076 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.663099 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.663117 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:45Z","lastTransitionTime":"2025-10-13T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.769973 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.770052 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.770084 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.770115 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.770138 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:45Z","lastTransitionTime":"2025-10-13T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.813907 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:45 crc kubenswrapper[4974]: E1013 18:15:45.814116 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.814559 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.814629 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:45 crc kubenswrapper[4974]: E1013 18:15:45.814767 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.814848 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:45 crc kubenswrapper[4974]: E1013 18:15:45.814990 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:45 crc kubenswrapper[4974]: E1013 18:15:45.815089 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.837794 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:42Z\\\",\\\"message\\\":\\\"s/externalversions/factory.go:140\\\\nI1013 18:15:42.028539 7039 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 18:15:42.029006 7039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 
18:15:42.029033 7039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:15:42.029082 7039 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:15:42.029083 7039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:15:42.029108 7039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:15:42.029127 7039 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:15:42.029135 7039 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:15:42.029155 7039 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:15:42.029166 7039 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:15:42.029202 7039 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 18:15:42.029219 7039 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:15:42.029256 7039 factory.go:656] Stopping watch factory\\\\nI1013 18:15:42.029276 7039 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.858309 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec4e67d-0eda-4b41-90f7-ef0371f5ec49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.873374 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.873457 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.873476 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.873534 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.873555 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:45Z","lastTransitionTime":"2025-10-13T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.876440 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.896757 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.915859 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.931762 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.947827 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:45 crc 
kubenswrapper[4974]: I1013 18:15:45.977503 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.977602 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.977622 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.977716 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.977737 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:45Z","lastTransitionTime":"2025-10-13T18:15:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:45 crc kubenswrapper[4974]: I1013 18:15:45.981459 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.001707 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:45Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.019741 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.038505 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.061503 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.081104 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.081183 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.081207 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.081239 4974 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.081263 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:46Z","lastTransitionTime":"2025-10-13T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.083917 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.106448 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.127895 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.151103 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.169138 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.184759 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.184849 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.184864 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.184882 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.184915 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:46Z","lastTransitionTime":"2025-10-13T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.189719 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"2025-10-13T18:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ 
to /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706\\\\n2025-10-13T18:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706 to /host/opt/cni/bin/\\\\n2025-10-13T18:14:50Z [verbose] multus-daemon started\\\\n2025-10-13T18:14:50Z [verbose] Readiness Indicator file check\\\\n2025-10-13T18:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/mult
us.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:46Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.286981 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.287067 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.287079 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.287099 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.287111 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:46Z","lastTransitionTime":"2025-10-13T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.390464 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.390528 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.390549 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.390574 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.390594 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:46Z","lastTransitionTime":"2025-10-13T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.493264 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.493329 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.493347 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.493371 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.493389 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:46Z","lastTransitionTime":"2025-10-13T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.596308 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.596371 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.596389 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.596413 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.596432 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:46Z","lastTransitionTime":"2025-10-13T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.699808 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.699872 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.699889 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.699913 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.699932 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:46Z","lastTransitionTime":"2025-10-13T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.802787 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.802849 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.802866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.802890 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.802908 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:46Z","lastTransitionTime":"2025-10-13T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.906505 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.906568 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.906588 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.906616 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:46 crc kubenswrapper[4974]: I1013 18:15:46.906635 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:46Z","lastTransitionTime":"2025-10-13T18:15:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.010708 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.010767 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.010798 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.010838 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.010864 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:47Z","lastTransitionTime":"2025-10-13T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.114135 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.114209 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.114227 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.114311 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.114332 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:47Z","lastTransitionTime":"2025-10-13T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.217761 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.217882 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.217902 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.217932 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.217951 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:47Z","lastTransitionTime":"2025-10-13T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.320992 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.321059 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.321082 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.321111 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.321133 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:47Z","lastTransitionTime":"2025-10-13T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.424560 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.424629 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.424689 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.424720 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.424753 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:47Z","lastTransitionTime":"2025-10-13T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.527927 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.527985 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.528002 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.528027 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.528045 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:47Z","lastTransitionTime":"2025-10-13T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.631113 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.631169 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.631185 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.631215 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.631236 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:47Z","lastTransitionTime":"2025-10-13T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.734597 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.734717 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.734742 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.734771 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.734795 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:47Z","lastTransitionTime":"2025-10-13T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.811121 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.811166 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.811214 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:47 crc kubenswrapper[4974]: E1013 18:15:47.811286 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.811310 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:47 crc kubenswrapper[4974]: E1013 18:15:47.811417 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:47 crc kubenswrapper[4974]: E1013 18:15:47.811491 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:47 crc kubenswrapper[4974]: E1013 18:15:47.811738 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.837603 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.837698 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.837726 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.837753 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.837778 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:47Z","lastTransitionTime":"2025-10-13T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.939717 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.939785 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.939812 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.939841 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:47 crc kubenswrapper[4974]: I1013 18:15:47.939862 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:47Z","lastTransitionTime":"2025-10-13T18:15:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.042571 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.042642 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.042694 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.042722 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.042740 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:48Z","lastTransitionTime":"2025-10-13T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.145835 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.145897 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.145914 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.145939 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.145959 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:48Z","lastTransitionTime":"2025-10-13T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.249301 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.249378 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.249405 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.249436 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.249458 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:48Z","lastTransitionTime":"2025-10-13T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.352395 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.352459 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.352475 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.352500 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.352518 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:48Z","lastTransitionTime":"2025-10-13T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.455531 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.455601 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.455619 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.455642 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.455685 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:48Z","lastTransitionTime":"2025-10-13T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.559056 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.559128 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.559169 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.559207 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.559254 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:48Z","lastTransitionTime":"2025-10-13T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.663048 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.663118 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.663141 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.663168 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.663191 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:48Z","lastTransitionTime":"2025-10-13T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.766877 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.766943 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.766966 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.766989 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.767005 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:48Z","lastTransitionTime":"2025-10-13T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.869577 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.869637 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.869701 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.869736 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.869759 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:48Z","lastTransitionTime":"2025-10-13T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.972369 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.972424 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.972450 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.972479 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:48 crc kubenswrapper[4974]: I1013 18:15:48.972501 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:48Z","lastTransitionTime":"2025-10-13T18:15:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.075950 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.076009 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.076030 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.076054 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.076072 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:49Z","lastTransitionTime":"2025-10-13T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.178184 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.178237 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.178250 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.178267 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.178278 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:49Z","lastTransitionTime":"2025-10-13T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.281183 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.281243 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.281255 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.281273 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.281286 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:49Z","lastTransitionTime":"2025-10-13T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.384475 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.384544 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.384568 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.384693 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.384742 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:49Z","lastTransitionTime":"2025-10-13T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.488068 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.488129 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.488153 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.488182 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.488205 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:49Z","lastTransitionTime":"2025-10-13T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.591630 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.591728 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.591764 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.591789 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.591806 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:49Z","lastTransitionTime":"2025-10-13T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.598375 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.598716 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-13 18:16:53.598634062 +0000 UTC m=+148.503000182 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.695064 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.695120 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.695137 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.695165 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.695190 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:49Z","lastTransitionTime":"2025-10-13T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.699823 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.699887 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.700307 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.700362 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700144 4974 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700515 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700143 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700545 4974 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700589 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.700522151 +0000 UTC m=+148.604888341 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700543 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700631 4974 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700597 4974 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700695 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.700620924 +0000 UTC m=+148.604987074 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700698 4974 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700742 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.700717396 +0000 UTC m=+148.605083586 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.700777 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.700764838 +0000 UTC m=+148.605130948 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.798064 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.798116 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.798133 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.798154 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.798171 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:49Z","lastTransitionTime":"2025-10-13T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.811341 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.811393 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.811423 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.811566 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.811792 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b"
Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.811937 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.812131 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 13 18:15:49 crc kubenswrapper[4974]: E1013 18:15:49.812229 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.901195 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.901271 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.901289 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.901314 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 18:15:49 crc kubenswrapper[4974]: I1013 18:15:49.901332 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:49Z","lastTransitionTime":"2025-10-13T18:15:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.004438 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.004501 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.004527 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.004551 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.004568 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:50Z","lastTransitionTime":"2025-10-13T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.107769 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.107830 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.107875 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.107901 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.107921 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:50Z","lastTransitionTime":"2025-10-13T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.211217 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.211409 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.211439 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.211478 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.211505 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:50Z","lastTransitionTime":"2025-10-13T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.314173 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.314236 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.314254 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.314277 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.314294 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:50Z","lastTransitionTime":"2025-10-13T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.418104 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.418154 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.418172 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.418195 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.418213 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:50Z","lastTransitionTime":"2025-10-13T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.521416 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.521483 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.521506 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.521538 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.521560 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:50Z","lastTransitionTime":"2025-10-13T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.624848 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.624902 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.624924 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.624952 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.624976 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:50Z","lastTransitionTime":"2025-10-13T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.728160 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.728230 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.728261 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.728288 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.728308 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:50Z","lastTransitionTime":"2025-10-13T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.831401 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.831485 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.831511 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.831538 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.831558 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:50Z","lastTransitionTime":"2025-10-13T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.934705 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.934791 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.934821 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.934857 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:50 crc kubenswrapper[4974]: I1013 18:15:50.934881 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:50Z","lastTransitionTime":"2025-10-13T18:15:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.037481 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.037602 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.037624 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.037683 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.037717 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:51Z","lastTransitionTime":"2025-10-13T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.140611 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.140704 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.140721 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.140746 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.140855 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:51Z","lastTransitionTime":"2025-10-13T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.244948 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.245307 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.245332 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.245365 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.245388 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:51Z","lastTransitionTime":"2025-10-13T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.348238 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.348310 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.348328 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.348353 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.348372 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:51Z","lastTransitionTime":"2025-10-13T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.451648 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.451773 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.451799 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.451827 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.451848 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:51Z","lastTransitionTime":"2025-10-13T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.555154 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.555369 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.555392 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.555417 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.555435 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:51Z","lastTransitionTime":"2025-10-13T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.659169 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.659245 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.659263 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.659292 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.659310 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:51Z","lastTransitionTime":"2025-10-13T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.763075 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.763115 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.763130 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.763154 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.763170 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:51Z","lastTransitionTime":"2025-10-13T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.810952 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.811037 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.810975 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.811262 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Oct 13 18:15:51 crc kubenswrapper[4974]: E1013 18:15:51.811142 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Oct 13 18:15:51 crc kubenswrapper[4974]: E1013 18:15:51.811471 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b"
Oct 13 18:15:51 crc kubenswrapper[4974]: E1013 18:15:51.811612 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Oct 13 18:15:51 crc kubenswrapper[4974]: E1013 18:15:51.811805 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.867032 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.867084 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.867145 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.867200 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.867220 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:51Z","lastTransitionTime":"2025-10-13T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.970122 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.970206 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.970223 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.970247 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:51 crc kubenswrapper[4974]: I1013 18:15:51.970266 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:51Z","lastTransitionTime":"2025-10-13T18:15:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.073298 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.073373 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.073391 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.073416 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.073434 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:52Z","lastTransitionTime":"2025-10-13T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.177196 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.177254 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.177270 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.177293 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.177311 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:52Z","lastTransitionTime":"2025-10-13T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.280998 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.281062 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.281083 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.281110 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.281128 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:52Z","lastTransitionTime":"2025-10-13T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.384823 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.384891 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.384910 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.384934 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.384952 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:52Z","lastTransitionTime":"2025-10-13T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.488173 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.488227 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.488258 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.488298 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.488324 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:52Z","lastTransitionTime":"2025-10-13T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.591124 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.591200 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.591223 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.591251 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.591271 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:52Z","lastTransitionTime":"2025-10-13T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.694693 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.694757 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.694777 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.694801 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.694818 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:52Z","lastTransitionTime":"2025-10-13T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.798030 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.798092 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.798111 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.798134 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.798155 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:52Z","lastTransitionTime":"2025-10-13T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.901296 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.901372 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.901391 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.901418 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:52 crc kubenswrapper[4974]: I1013 18:15:52.901436 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:52Z","lastTransitionTime":"2025-10-13T18:15:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.013602 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.013711 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.013735 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.013769 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.013791 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.117186 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.117249 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.117267 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.117291 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.117310 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.220829 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.220899 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.220916 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.220942 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.220961 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.324142 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.324213 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.324235 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.324262 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.324282 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.427976 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.428038 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.428061 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.428093 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.428115 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.503515 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.503583 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.503600 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.503625 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.503642 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.524823 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.530468 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.530534 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.530552 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.530580 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.530600 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.551611 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.556296 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.556334 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.556347 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.556364 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.556376 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.576105 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.580679 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.580708 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.580718 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.580733 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.580744 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.594533 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.600810 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.600855 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.600866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.600884 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.600896 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.619126 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:53Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.619344 4974 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.621361 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.621417 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.621435 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.621460 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.621479 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.724853 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.724931 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.724957 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.724989 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.725016 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.810622 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.810720 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.810739 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.810778 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.810863 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.810975 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.811139 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.811634 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.812048 4974 scope.go:117] "RemoveContainer" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:15:53 crc kubenswrapper[4974]: E1013 18:15:53.812273 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.827481 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.827537 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.827553 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.827622 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.827642 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.930800 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.930944 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.930965 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.930989 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:53 crc kubenswrapper[4974]: I1013 18:15:53.931007 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:53Z","lastTransitionTime":"2025-10-13T18:15:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.036962 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.037026 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.037050 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.037081 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.037103 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:54Z","lastTransitionTime":"2025-10-13T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.140149 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.140204 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.140226 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.140259 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.140281 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:54Z","lastTransitionTime":"2025-10-13T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.243627 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.243751 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.243776 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.243798 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.243816 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:54Z","lastTransitionTime":"2025-10-13T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.346981 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.347179 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.347207 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.347234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.347257 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:54Z","lastTransitionTime":"2025-10-13T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.451051 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.451253 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.451283 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.451363 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.451386 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:54Z","lastTransitionTime":"2025-10-13T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.554124 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.554192 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.554210 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.554237 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.554253 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:54Z","lastTransitionTime":"2025-10-13T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.656746 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.656820 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.656843 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.656874 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.656893 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:54Z","lastTransitionTime":"2025-10-13T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.760605 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.760692 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.760712 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.760735 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.760755 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:54Z","lastTransitionTime":"2025-10-13T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.863470 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.863533 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.863550 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.863576 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.863594 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:54Z","lastTransitionTime":"2025-10-13T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.965908 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.965999 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.966025 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.966061 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:54 crc kubenswrapper[4974]: I1013 18:15:54.966087 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:54Z","lastTransitionTime":"2025-10-13T18:15:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.068702 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.068774 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.068793 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.068821 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.068841 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:55Z","lastTransitionTime":"2025-10-13T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.172032 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.172099 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.172117 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.172145 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.172165 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:55Z","lastTransitionTime":"2025-10-13T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.275411 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.275466 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.275484 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.275508 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.275527 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:55Z","lastTransitionTime":"2025-10-13T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.378084 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.378181 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.378201 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.378225 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.378243 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:55Z","lastTransitionTime":"2025-10-13T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.481311 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.481361 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.481378 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.481401 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.481418 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:55Z","lastTransitionTime":"2025-10-13T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.584414 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.584516 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.584541 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.584612 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.584639 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:55Z","lastTransitionTime":"2025-10-13T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.687953 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.688030 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.688056 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.688086 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.688112 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:55Z","lastTransitionTime":"2025-10-13T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.791543 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.791610 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.791630 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.791685 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.791705 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:55Z","lastTransitionTime":"2025-10-13T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.811367 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.811618 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.811649 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.812104 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:55 crc kubenswrapper[4974]: E1013 18:15:55.812126 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:55 crc kubenswrapper[4974]: E1013 18:15:55.811974 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:55 crc kubenswrapper[4974]: E1013 18:15:55.812256 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:55 crc kubenswrapper[4974]: E1013 18:15:55.812366 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.835284 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-
dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf1442\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/opensh
ift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f84e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.854836 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.875495 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"2025-10-13T18:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706\\\\n2025-10-13T18:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706 to /host/opt/cni/bin/\\\\n2025-10-13T18:14:50Z [verbose] multus-daemon started\\\\n2025-10-13T18:14:50Z [verbose] 
Readiness Indicator file check\\\\n2025-10-13T18:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.890313 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.895063 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.895125 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.895148 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:55 crc 
kubenswrapper[4974]: I1013 18:15:55.895179 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.895200 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:55Z","lastTransitionTime":"2025-10-13T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.904479 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.937351 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:42Z\\\",\\\"message\\\":\\\"s/externalversions/factory.go:140\\\\nI1013 18:15:42.028539 7039 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 18:15:42.029006 7039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 
18:15:42.029033 7039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:15:42.029082 7039 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:15:42.029083 7039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:15:42.029108 7039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:15:42.029127 7039 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:15:42.029135 7039 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:15:42.029155 7039 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:15:42.029166 7039 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:15:42.029202 7039 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 18:15:42.029219 7039 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:15:42.029256 7039 factory.go:656] Stopping watch factory\\\\nI1013 18:15:42.029276 7039 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.954339 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec4e67d-0eda-4b41-90f7-ef0371f5ec49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.970524 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:15:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.988878 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:55Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.998065 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.998100 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.998111 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.998130 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:55 crc kubenswrapper[4974]: I1013 18:15:55.998142 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:55Z","lastTransitionTime":"2025-10-13T18:15:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.004915 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.022233 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b62756
3f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.040222 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:56 crc 
kubenswrapper[4974]: I1013 18:15:56.067304 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.085713 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.101317 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.101396 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.101423 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:56 crc 
kubenswrapper[4974]: I1013 18:15:56.101455 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.101480 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:56Z","lastTransitionTime":"2025-10-13T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.105969 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed
61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.126126 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.146642 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.166294 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:15:56Z is after 2025-08-24T17:21:41Z" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.205239 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.205300 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.205319 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.205343 4974 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.205367 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:56Z","lastTransitionTime":"2025-10-13T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.308956 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.309035 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.309063 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.309094 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.309112 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:56Z","lastTransitionTime":"2025-10-13T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.412562 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.412607 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.412618 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.412636 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.412647 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:56Z","lastTransitionTime":"2025-10-13T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.515495 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.515582 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.515606 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.515637 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.515742 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:56Z","lastTransitionTime":"2025-10-13T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.619136 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.619199 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.619222 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.619249 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.619266 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:56Z","lastTransitionTime":"2025-10-13T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.722027 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.722079 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.722091 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.722112 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.722124 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:56Z","lastTransitionTime":"2025-10-13T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.825302 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.825382 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.825411 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.825438 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.825455 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:56Z","lastTransitionTime":"2025-10-13T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.928372 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.928417 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.928428 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.928442 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:56 crc kubenswrapper[4974]: I1013 18:15:56.928452 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:56Z","lastTransitionTime":"2025-10-13T18:15:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.030104 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.030141 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.030150 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.030163 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.030172 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:57Z","lastTransitionTime":"2025-10-13T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.132392 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.132464 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.132481 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.132510 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.132528 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:57Z","lastTransitionTime":"2025-10-13T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.235896 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.235969 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.235988 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.236015 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.236036 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:57Z","lastTransitionTime":"2025-10-13T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.339234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.339329 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.339347 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.339403 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.339422 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:57Z","lastTransitionTime":"2025-10-13T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.442428 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.442534 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.442560 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.442591 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.442612 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:57Z","lastTransitionTime":"2025-10-13T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.546164 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.546247 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.546269 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.546299 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.546318 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:57Z","lastTransitionTime":"2025-10-13T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.649435 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.649493 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.649518 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.649551 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.649574 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:57Z","lastTransitionTime":"2025-10-13T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.752459 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.752518 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.752535 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.752560 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.752578 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:57Z","lastTransitionTime":"2025-10-13T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.811458 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.811516 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.811563 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:57 crc kubenswrapper[4974]: E1013 18:15:57.811735 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.811773 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:57 crc kubenswrapper[4974]: E1013 18:15:57.811937 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:57 crc kubenswrapper[4974]: E1013 18:15:57.812163 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:57 crc kubenswrapper[4974]: E1013 18:15:57.812563 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.856212 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.856291 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.856310 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.856334 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.856352 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:57Z","lastTransitionTime":"2025-10-13T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.959477 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.959567 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.959591 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.959625 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:57 crc kubenswrapper[4974]: I1013 18:15:57.959648 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:57Z","lastTransitionTime":"2025-10-13T18:15:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.062901 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.062947 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.062963 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.062986 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.063005 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:58Z","lastTransitionTime":"2025-10-13T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.165716 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.165762 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.165771 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.165787 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.165797 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:58Z","lastTransitionTime":"2025-10-13T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.269044 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.269102 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.269119 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.269142 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.269159 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:58Z","lastTransitionTime":"2025-10-13T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.373594 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.373719 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.373738 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.373840 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.373860 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:58Z","lastTransitionTime":"2025-10-13T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.477190 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.477259 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.477282 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.477309 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.477330 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:58Z","lastTransitionTime":"2025-10-13T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.581497 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.581573 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.581609 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.581647 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.581714 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:58Z","lastTransitionTime":"2025-10-13T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.684394 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.684465 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.684486 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.684515 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.684537 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:58Z","lastTransitionTime":"2025-10-13T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.792174 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.792240 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.792276 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.792312 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.792335 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:58Z","lastTransitionTime":"2025-10-13T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.827916 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.896141 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.896223 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.896242 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.896293 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.896310 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:58Z","lastTransitionTime":"2025-10-13T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.999630 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.999716 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.999733 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.999756 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:58 crc kubenswrapper[4974]: I1013 18:15:58.999773 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:58Z","lastTransitionTime":"2025-10-13T18:15:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.102786 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.102843 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.102890 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.102914 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.102931 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:59Z","lastTransitionTime":"2025-10-13T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.205350 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.205394 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.205451 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.205473 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.205489 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:59Z","lastTransitionTime":"2025-10-13T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.309451 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.309525 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.309542 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.309567 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.309586 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:59Z","lastTransitionTime":"2025-10-13T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.412642 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.412739 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.412757 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.412779 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.412796 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:59Z","lastTransitionTime":"2025-10-13T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.515524 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.515602 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.515622 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.515719 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.515749 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:59Z","lastTransitionTime":"2025-10-13T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.618809 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.618923 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.618947 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.619372 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.619602 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:59Z","lastTransitionTime":"2025-10-13T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.723023 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.723087 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.723105 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.723130 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.723148 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:59Z","lastTransitionTime":"2025-10-13T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.811369 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.811476 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.811399 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.811386 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:15:59 crc kubenswrapper[4974]: E1013 18:15:59.811591 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:15:59 crc kubenswrapper[4974]: E1013 18:15:59.811783 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:15:59 crc kubenswrapper[4974]: E1013 18:15:59.811882 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:15:59 crc kubenswrapper[4974]: E1013 18:15:59.812149 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.825879 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.825930 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.825946 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.825968 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.825989 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:59Z","lastTransitionTime":"2025-10-13T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.928864 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.928930 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.928949 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.928978 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:15:59 crc kubenswrapper[4974]: I1013 18:15:59.928998 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:15:59Z","lastTransitionTime":"2025-10-13T18:15:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.031679 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.031745 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.031762 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.031789 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.031808 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:00Z","lastTransitionTime":"2025-10-13T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.134744 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.134810 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.134833 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.134864 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.134886 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:00Z","lastTransitionTime":"2025-10-13T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.238087 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.238167 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.238187 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.238213 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.238231 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:00Z","lastTransitionTime":"2025-10-13T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.341780 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.341834 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.341851 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.341875 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.341892 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:00Z","lastTransitionTime":"2025-10-13T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.445645 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.445728 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.445748 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.445777 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.445799 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:00Z","lastTransitionTime":"2025-10-13T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.550780 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.550855 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.550879 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.550907 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.550930 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:00Z","lastTransitionTime":"2025-10-13T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.653252 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.653308 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.653324 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.653351 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.653368 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:00Z","lastTransitionTime":"2025-10-13T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.756416 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.756483 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.756502 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.756527 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.756544 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:00Z","lastTransitionTime":"2025-10-13T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.859518 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.859610 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.859630 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.859692 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.859711 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:00Z","lastTransitionTime":"2025-10-13T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.962335 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.962410 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.962429 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.962458 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:00 crc kubenswrapper[4974]: I1013 18:16:00.962476 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:00Z","lastTransitionTime":"2025-10-13T18:16:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.066327 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.066386 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.066406 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.066432 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.066450 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:01Z","lastTransitionTime":"2025-10-13T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.168051 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.168102 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.168115 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.168132 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.168144 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:01Z","lastTransitionTime":"2025-10-13T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.270685 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.270751 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.270778 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.270812 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.270839 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:01Z","lastTransitionTime":"2025-10-13T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.372916 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.372957 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.372965 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.372980 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.372990 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:01Z","lastTransitionTime":"2025-10-13T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.476544 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.476616 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.476639 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.476711 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.476730 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:01Z","lastTransitionTime":"2025-10-13T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.580149 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.580216 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.580241 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.580273 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.580294 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:01Z","lastTransitionTime":"2025-10-13T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.683555 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.683603 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.683619 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.683639 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.683695 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:01Z","lastTransitionTime":"2025-10-13T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.786596 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.786718 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.786741 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.786771 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.786793 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:01Z","lastTransitionTime":"2025-10-13T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.811320 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.811416 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:01 crc kubenswrapper[4974]: E1013 18:16:01.811493 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.811608 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:01 crc kubenswrapper[4974]: E1013 18:16:01.811859 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.811889 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:01 crc kubenswrapper[4974]: E1013 18:16:01.812009 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:01 crc kubenswrapper[4974]: E1013 18:16:01.812118 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.890699 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.890751 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.890767 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.890788 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.890806 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:01Z","lastTransitionTime":"2025-10-13T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.993703 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.993761 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.993782 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.993813 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:01 crc kubenswrapper[4974]: I1013 18:16:01.993836 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:01Z","lastTransitionTime":"2025-10-13T18:16:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.096876 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.096943 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.096962 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.096991 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.097010 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:02Z","lastTransitionTime":"2025-10-13T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.199883 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.199945 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.199962 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.199987 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.200004 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:02Z","lastTransitionTime":"2025-10-13T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.303182 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.303239 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.303256 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.303281 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.303299 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:02Z","lastTransitionTime":"2025-10-13T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.406724 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.406801 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.406825 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.406854 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.406873 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:02Z","lastTransitionTime":"2025-10-13T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.509625 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.509753 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.509779 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.509811 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.509834 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:02Z","lastTransitionTime":"2025-10-13T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.612550 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.612604 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.612623 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.612646 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.612692 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:02Z","lastTransitionTime":"2025-10-13T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.715542 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.715627 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.715697 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.715733 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.715757 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:02Z","lastTransitionTime":"2025-10-13T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.818822 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.818893 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.818916 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.818947 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.818971 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:02Z","lastTransitionTime":"2025-10-13T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.922413 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.922456 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.922471 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.922492 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:02 crc kubenswrapper[4974]: I1013 18:16:02.922509 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:02Z","lastTransitionTime":"2025-10-13T18:16:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.025208 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.025262 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.025280 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.025303 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.025321 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.163534 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.163601 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.163625 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.163686 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.163712 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.266438 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.266489 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.266508 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.266535 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.266554 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.369277 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.369319 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.369331 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.369352 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.369378 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.472468 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.472524 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.472540 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.472561 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.472579 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.576403 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.576451 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.576489 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.576511 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.576524 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.680447 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.680573 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.680639 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.680711 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.680736 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.784879 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.784930 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.784941 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.784959 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.784973 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.788271 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.788318 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.788334 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.788357 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.788374 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: E1013 18:16:03.808928 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:03Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.810895 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:03 crc kubenswrapper[4974]: E1013 18:16:03.811092 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.811900 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:03 crc kubenswrapper[4974]: E1013 18:16:03.812036 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.812728 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:03 crc kubenswrapper[4974]: E1013 18:16:03.812871 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.813078 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:03 crc kubenswrapper[4974]: E1013 18:16:03.813490 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.815298 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.815335 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.815352 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.815372 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.815388 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: E1013 18:16:03.835764 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:03Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.841201 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.841250 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.841267 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.841290 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.841307 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: E1013 18:16:03.861723 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:03Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.866827 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.866910 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.866928 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.866952 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.866971 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: E1013 18:16:03.887102 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:03Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.892061 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.892111 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.892127 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.892152 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.892168 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:03 crc kubenswrapper[4974]: E1013 18:16:03.912570 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:03Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:03 crc kubenswrapper[4974]: E1013 18:16:03.912902 4974 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.915931 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.916014 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.916040 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.916073 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:03 crc kubenswrapper[4974]: I1013 18:16:03.916110 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:03Z","lastTransitionTime":"2025-10-13T18:16:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.019261 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.019330 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.019349 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.019375 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.019393 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:04Z","lastTransitionTime":"2025-10-13T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.122590 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.122646 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.122692 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.122716 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.122733 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:04Z","lastTransitionTime":"2025-10-13T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.226192 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.226234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.226248 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.226264 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.226274 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:04Z","lastTransitionTime":"2025-10-13T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.329232 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.329281 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.329305 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.329332 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.329353 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:04Z","lastTransitionTime":"2025-10-13T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.431986 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.432045 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.432068 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.432099 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.432119 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:04Z","lastTransitionTime":"2025-10-13T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.535445 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.535505 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.535734 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.535780 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.535806 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:04Z","lastTransitionTime":"2025-10-13T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.638844 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.638904 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.638926 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.638956 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.638978 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:04Z","lastTransitionTime":"2025-10-13T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.742136 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.742196 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.742216 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.742241 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.742259 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:04Z","lastTransitionTime":"2025-10-13T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.845619 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.845692 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.845715 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.845740 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.845760 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:04Z","lastTransitionTime":"2025-10-13T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.950234 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.950283 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.950305 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.950354 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:04 crc kubenswrapper[4974]: I1013 18:16:04.950391 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:04Z","lastTransitionTime":"2025-10-13T18:16:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.053693 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.053769 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.053789 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.053816 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.053835 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:05Z","lastTransitionTime":"2025-10-13T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.156564 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.156727 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.156761 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.156784 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.156801 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:05Z","lastTransitionTime":"2025-10-13T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.259964 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.260029 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.260052 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.260080 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.260103 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:05Z","lastTransitionTime":"2025-10-13T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.363140 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.363186 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.363202 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.363221 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.363236 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:05Z","lastTransitionTime":"2025-10-13T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.465981 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.466056 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.466079 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.466108 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.466131 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:05Z","lastTransitionTime":"2025-10-13T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.568755 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.568818 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.568836 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.568867 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.568884 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:05Z","lastTransitionTime":"2025-10-13T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.672168 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.672566 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.672756 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.672907 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.673072 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:05Z","lastTransitionTime":"2025-10-13T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.777054 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.777128 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.777147 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.777172 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.777244 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:05Z","lastTransitionTime":"2025-10-13T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.811441 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.811722 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:05 crc kubenswrapper[4974]: E1013 18:16:05.811855 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:05 crc kubenswrapper[4974]: E1013 18:16:05.811890 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.812002 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:05 crc kubenswrapper[4974]: E1013 18:16:05.812210 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.812013 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:05 crc kubenswrapper[4974]: E1013 18:16:05.812386 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.828526 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-r5fj5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cddd1702-fcd5-40f0-97fc-61eb59192de8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe057ca54acd4b6a2120c7ea3679537d0888afdda9baffa80ed90011b0ca1164\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-27gl5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-r5fj5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.846255 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"744ca489-ade8-41c6-94da-0b2d51a9ca6b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aa6
c942fdc1278efc771eac7ed2f4ed4bf07c2b79289a85c7142a06ca068e44b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://11bfc0f6f29d9072997ce3645c877f6b627563f8ae5d2b55ff13b957b69fec67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nc2ds\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Runni
ng\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-rt4jb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.861771 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a260247c-2399-42b5-bddc-73e38659680b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7spln\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:15:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-z9hj4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:05 crc 
kubenswrapper[4974]: I1013 18:16:05.878042 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"845866c9-e578-4071-bd4e-c4eb1014fe58\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bff0fea41490697624b947252e1babd980715a0a78dac154faa90511067e736b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://66c168a412844860c141c82e64731411223f40ddf9ea57e2bf812d02216b44e9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66c168a412844860c141c82e64731411223f40ddf9ea57e2bf812d02216b44e9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.880779 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.880858 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.880884 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 
18:16:05.880914 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.880936 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:05Z","lastTransitionTime":"2025-10-13T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.916465 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f6936b43-9bbd-4e18-a341-7e7bec67792b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c96434877acd54a68ea601b12b22b31392ea7cdefcaba460886d9546d5909d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://01fd5618f813b613b587226e57f23d1ef4fd12fdf85dd436b1bbc45ff1449f2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0103a86a53a42aaf0ac4e9c5b08499bcd2b678ab9cd8d56b9239a4c16084323\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cc6eac23b737f1af062878c3fe4eba50e5ccbcb271c7c64a6ea81b8edae72a95\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://62ceca29d4f0328c4c70528faef3b244373898ae238a89e287d2f5c22918d9fc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://592edd946c80c6ff4ed94b731ef47ef40019ecaa3119ba34c93de861ebb20b9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ce6efdb18e83056c3834ddfc76b186c37564b922d61f74172178eb58e3a7e319\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1298b7450c3175ec6a143a291b15360c56e8c9179f2f61c50fd57e59100098a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.937339 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.959565 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05ac96ec-aee9-4f1d-868c-6f2252c021bb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79843b0b77d488e7a8910513196135d2efd9492cd5e6c62fa0fd1afe7dce3030\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://55f73e1ffdacca226c71e5f333967ae60f3b41690b2d9ff8f2d0779484a97ce6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6870c201326875252dbc92c6fecd1ed747b8e95620f47824da2f9b2e4817adf6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://83633a2331d5859d909633833319563cdebd783ce0e93b53c1dc8668013b4de3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e466f
aaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e466faaed6bb10a1a42b75d21fd5fe1c4371c3474bfa02ca343f8df324c1ebcd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6da819a5c27a753fe52b763861e0665cbca71f16d5d323ed76f359333e779c97\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:52Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed61fa94d20dde7ef0eb39cefc5b0f055b78a60fd7d55e640f71d45dbe227e9f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ccpr7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gwv4w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.977047 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"720a1014-066f-4689-9904-1e388f63caff\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e87aa4897e838ccccf57a64b45d0d762f4f2017200b9954b1bbbc8b21d6fe9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e0e33d90cdca198e5d0798032d02314b7853758e1da48129c33eadf28d00a93\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac6611eecfbb24554d1f91015d899c604a067988f785db7c417a983d5d6b759d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://663316f65529cb8a56f374a65d9b289c0c0c6db78d36f7f1e420389bf2b603c4\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.983272 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.983332 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.983346 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.983389 
4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.983404 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:05Z","lastTransitionTime":"2025-10-13T18:16:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.993335 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:05 crc kubenswrapper[4974]: E1013 18:16:05.993692 4974 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:16:05 crc kubenswrapper[4974]: E1013 18:16:05.993799 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs podName:a260247c-2399-42b5-bddc-73e38659680b nodeName:}" failed. No retries permitted until 2025-10-13 18:17:09.993766791 +0000 UTC m=+164.898132911 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs") pod "network-metrics-daemon-z9hj4" (UID: "a260247c-2399-42b5-bddc-73e38659680b") : object "openshift-multus"/"metrics-daemon-secret" not registered Oct 13 18:16:05 crc kubenswrapper[4974]: I1013 18:16:05.993677 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://99d99d4865efa04e0a8cc8bab3210f51cfa5286fb1869836f31c619e6fbfabd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-
cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:05Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.011323 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88ac97731851fc9aea36bb7388af2bfa69cbd6c43850d8131bd5111ae859db16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://580cf4da8adacbc5f60418c9ac359a5a57f585fa4aa2f9920f74bf60487d9ff6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.030927 4974 
status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b002cc44-835c-436d-b330-a6b0401cc065\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8dc54f4f1d6305e5a93110850bdea828e1788b7d7ccd6f6510ae5fc46812bebf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e2a6e9e2afd8b27b4fa9b2ddb2a10b1e60ac9fdb1a29f486667942b1cdf14
42\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://717de2889fd5c6912146e531a65a860c0f36f10370e38d63504f09b247b375b9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8527fd7ced697fe1766f1ea654acec61ae026b869812d36555409218edde89bd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f8
4e64ac73cf877da0ccd1fc72650cf9d029ba934c5cf444a651000dc1da472c\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1013 18:14:39.676264 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1013 18:14:39.678201 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2418768835/tls.crt::/tmp/serving-cert-2418768835/tls.key\\\\\\\"\\\\nI1013 18:14:45.631557 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1013 18:14:45.636000 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1013 18:14:45.636043 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1013 18:14:45.636089 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1013 18:14:45.636113 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1013 18:14:45.648016 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1013 18:14:45.648061 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1013 18:14:45.648070 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648088 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1013 18:14:45.648101 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1013 18:14:45.648110 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1013 18:14:45.648119 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1013 18:14:45.648128 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1013 18:14:45.657417 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:29Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0f7f43fbcc0ea442ca24937e698944150c02f13d23ae1f569ec1e4e1a6153e64\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://67d5be35187b1fae8b7903667e79377ecd39e17d6a908b268179a2e45d9df94c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.044718 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.062746 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xcspx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:35Z\\\",\\\"message\\\":\\\"2025-10-13T18:14:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706\\\\n2025-10-13T18:14:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_93252ecb-54a8-4b99-9bd9-5d7832e7b706 to /host/opt/cni/bin/\\\\n2025-10-13T18:14:50Z [verbose] multus-daemon started\\\\n2025-10-13T18:14:50Z [verbose] 
Readiness Indicator file check\\\\n2025-10-13T18:15:35Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:15:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nw9b5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xcspx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.082537 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"013d968f-6cef-476b-a6fc-88d396bd5cd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://547a2cd84f086127dff7e5f5a7e398eb3d8f727442d92b84f73de52e49bec230\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa
7fd3e9f14a539c3923d4b1c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-klxdc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-xpb6b\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.087306 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.087368 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.087388 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:06 crc 
kubenswrapper[4974]: I1013 18:16:06.087421 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.087445 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:06Z","lastTransitionTime":"2025-10-13T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.100698 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-98z75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6b2a80de-225e-4b5a-93fa-a05e3524db4e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3720dae7e62c85b0c936d28a5e3d05fe6ce53ea8dcd9ee92dbb0201f3229d8f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z5zsg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-98z75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.133377 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9f54cc7-5b3b-4481-9be5-f03df1854435\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-10-13T18:15:42Z\\\",\\\"message\\\":\\\"s/externalversions/factory.go:140\\\\nI1013 18:15:42.028539 7039 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1013 18:15:42.029006 7039 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1013 
18:15:42.029033 7039 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1013 18:15:42.029082 7039 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1013 18:15:42.029083 7039 handler.go:208] Removed *v1.Node event handler 2\\\\nI1013 18:15:42.029108 7039 handler.go:208] Removed *v1.Node event handler 7\\\\nI1013 18:15:42.029127 7039 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1013 18:15:42.029135 7039 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1013 18:15:42.029155 7039 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1013 18:15:42.029166 7039 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1013 18:15:42.029202 7039 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1013 18:15:42.029219 7039 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1013 18:15:42.029256 7039 factory.go:656] Stopping watch factory\\\\nI1013 18:15:42.029276 7039 ovnkube.go:599] Stopped ovnkube\\\\nI1013 18\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-10-13T18:15:40Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e2f9efbb27ffab44f
1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s8nch\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-zwcs8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.153286 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ec4e67d-0eda-4b41-90f7-ef0371f5ec49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:15:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c76aeb1156f079b7cbc4c0553eeea4cf5929dc0aa0ab6e6010a9f4588ecb6f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd05f0e33255ca34350d583298cdb6e4cef6573c67613d31cf53f60d2cae2792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c44540d6f9c2317f4194ee49126bed6b9cc86346979eddff48829c63853b131a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://86fc79cfd0646e22e655563f09e78a43265e16a4737ecd229a0058b0f435fe16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-10-13T18:14:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-10-13T18:14:26Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-10-13T18:14:25Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.172723 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:48Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b8697bc1fb5728b2afb4585390ccd1e3ad01d3c9c845a0d0d46a8521a9b0446\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-10-13T18:14:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-10-13T18:16:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.191861 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.191911 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.191928 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.191954 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.191974 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:06Z","lastTransitionTime":"2025-10-13T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.192899 4974 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-10-13T18:14:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:06Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.295200 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.295268 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.295288 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.295314 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.295332 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:06Z","lastTransitionTime":"2025-10-13T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.398428 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.398486 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.398504 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.398529 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.398546 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:06Z","lastTransitionTime":"2025-10-13T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.503269 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.503326 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.503342 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.503368 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.503386 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:06Z","lastTransitionTime":"2025-10-13T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.606835 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.606905 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.606928 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.606977 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.607002 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:06Z","lastTransitionTime":"2025-10-13T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.710308 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.710409 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.710451 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.710484 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.710506 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:06Z","lastTransitionTime":"2025-10-13T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.812330 4974 scope.go:117] "RemoveContainer" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:16:06 crc kubenswrapper[4974]: E1013 18:16:06.812727 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.813375 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.813428 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.813451 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.813480 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.813503 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:06Z","lastTransitionTime":"2025-10-13T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.920947 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.921005 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.921016 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.921037 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:06 crc kubenswrapper[4974]: I1013 18:16:06.921049 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:06Z","lastTransitionTime":"2025-10-13T18:16:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.024950 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.025007 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.025018 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.025038 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.025052 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:07Z","lastTransitionTime":"2025-10-13T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.127692 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.127776 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.127803 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.127842 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.127862 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:07Z","lastTransitionTime":"2025-10-13T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.231840 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.231903 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.231916 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.231939 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.231952 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:07Z","lastTransitionTime":"2025-10-13T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.335812 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.335879 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.335919 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.335956 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.335982 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:07Z","lastTransitionTime":"2025-10-13T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.438912 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.438983 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.439001 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.439027 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.439048 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:07Z","lastTransitionTime":"2025-10-13T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.542571 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.542694 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.542724 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.542759 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.542783 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:07Z","lastTransitionTime":"2025-10-13T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.645333 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.645423 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.645443 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.645501 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.645517 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:07Z","lastTransitionTime":"2025-10-13T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.748060 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.748169 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.748193 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.748269 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.748289 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:07Z","lastTransitionTime":"2025-10-13T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.811207 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.811285 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:07 crc kubenswrapper[4974]: E1013 18:16:07.811398 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:07 crc kubenswrapper[4974]: E1013 18:16:07.811559 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.811764 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.811778 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:07 crc kubenswrapper[4974]: E1013 18:16:07.811910 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:07 crc kubenswrapper[4974]: E1013 18:16:07.811999 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.851475 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.851548 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.851571 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.851600 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.851623 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:07Z","lastTransitionTime":"2025-10-13T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.954840 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.954924 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.954955 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.954985 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:07 crc kubenswrapper[4974]: I1013 18:16:07.955007 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:07Z","lastTransitionTime":"2025-10-13T18:16:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.059166 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.059248 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.059268 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.059292 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.059315 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:08Z","lastTransitionTime":"2025-10-13T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.162864 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.162923 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.162941 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.162966 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.162986 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:08Z","lastTransitionTime":"2025-10-13T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.267137 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.267511 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.267540 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.267572 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.267595 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:08Z","lastTransitionTime":"2025-10-13T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.371701 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.371774 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.371799 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.371838 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.371864 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:08Z","lastTransitionTime":"2025-10-13T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.474824 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.474912 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.474931 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.474962 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.474984 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:08Z","lastTransitionTime":"2025-10-13T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.578029 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.578100 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.578120 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.578145 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.578164 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:08Z","lastTransitionTime":"2025-10-13T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.681462 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.681503 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.681513 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.681529 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.681540 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:08Z","lastTransitionTime":"2025-10-13T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.785969 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.786051 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.786089 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.786128 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.786153 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:08Z","lastTransitionTime":"2025-10-13T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.889386 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.889440 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.889458 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.889482 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.889500 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:08Z","lastTransitionTime":"2025-10-13T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.992619 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.992744 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.992772 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.992807 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:08 crc kubenswrapper[4974]: I1013 18:16:08.992831 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:08Z","lastTransitionTime":"2025-10-13T18:16:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.095848 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.095899 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.095910 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.095931 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.095943 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:09Z","lastTransitionTime":"2025-10-13T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.199773 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.199865 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.199888 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.200104 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.200131 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:09Z","lastTransitionTime":"2025-10-13T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.302427 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.302480 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.302494 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.302515 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.302531 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:09Z","lastTransitionTime":"2025-10-13T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.405072 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.405132 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.405154 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.405183 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.405205 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:09Z","lastTransitionTime":"2025-10-13T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.508768 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.508824 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.508841 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.508864 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.508882 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:09Z","lastTransitionTime":"2025-10-13T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.612245 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.612389 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.612457 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.612480 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.612546 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:09Z","lastTransitionTime":"2025-10-13T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.715252 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.715308 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.715325 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.715350 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.715370 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:09Z","lastTransitionTime":"2025-10-13T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.810772 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.810818 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.810971 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:09 crc kubenswrapper[4974]: E1013 18:16:09.811209 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.811487 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:09 crc kubenswrapper[4974]: E1013 18:16:09.811639 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:09 crc kubenswrapper[4974]: E1013 18:16:09.811866 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:09 crc kubenswrapper[4974]: E1013 18:16:09.812052 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.817902 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.817944 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.817959 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.817980 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.817996 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:09Z","lastTransitionTime":"2025-10-13T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.921376 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.921458 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.921485 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.921518 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:09 crc kubenswrapper[4974]: I1013 18:16:09.921540 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:09Z","lastTransitionTime":"2025-10-13T18:16:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.025462 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.025563 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.025585 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.025650 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.025737 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:10Z","lastTransitionTime":"2025-10-13T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.128755 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.128818 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.128838 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.128865 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.128883 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:10Z","lastTransitionTime":"2025-10-13T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.232120 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.232185 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.232202 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.232229 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.232247 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:10Z","lastTransitionTime":"2025-10-13T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.336074 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.336151 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.336173 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.336227 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.336260 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:10Z","lastTransitionTime":"2025-10-13T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.438543 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.438602 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.438618 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.438642 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.438699 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:10Z","lastTransitionTime":"2025-10-13T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.542164 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.542229 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.542246 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.542276 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.542301 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:10Z","lastTransitionTime":"2025-10-13T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.645276 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.645399 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.645424 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.645456 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.645476 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:10Z","lastTransitionTime":"2025-10-13T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.749283 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.749359 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.749371 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.749403 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.749419 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:10Z","lastTransitionTime":"2025-10-13T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.852318 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.852376 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.852395 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.852419 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.852439 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:10Z","lastTransitionTime":"2025-10-13T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.955355 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.955419 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.955436 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.955466 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:10 crc kubenswrapper[4974]: I1013 18:16:10.955484 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:10Z","lastTransitionTime":"2025-10-13T18:16:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.060727 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.060777 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.060795 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.060818 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.060835 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:11Z","lastTransitionTime":"2025-10-13T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.165001 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.165065 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.165082 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.165110 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.165130 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:11Z","lastTransitionTime":"2025-10-13T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.268147 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.268210 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.268231 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.268257 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.268276 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:11Z","lastTransitionTime":"2025-10-13T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.371779 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.371841 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.371859 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.371889 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.371906 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:11Z","lastTransitionTime":"2025-10-13T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.475928 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.476001 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.476021 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.476049 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.476070 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:11Z","lastTransitionTime":"2025-10-13T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.579257 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.579315 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.579332 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.579355 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.579375 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:11Z","lastTransitionTime":"2025-10-13T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.682271 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.682312 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.682323 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.682341 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.682353 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:11Z","lastTransitionTime":"2025-10-13T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.785415 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.785489 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.785508 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.785533 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.785551 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:11Z","lastTransitionTime":"2025-10-13T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.811222 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.811305 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.811225 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:11 crc kubenswrapper[4974]: E1013 18:16:11.811498 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.811244 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:11 crc kubenswrapper[4974]: E1013 18:16:11.811624 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:11 crc kubenswrapper[4974]: E1013 18:16:11.811396 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:11 crc kubenswrapper[4974]: E1013 18:16:11.811764 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.889475 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.889539 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.889573 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.889604 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.889624 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:11Z","lastTransitionTime":"2025-10-13T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.993109 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.993167 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.993179 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.993200 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:11 crc kubenswrapper[4974]: I1013 18:16:11.993214 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:11Z","lastTransitionTime":"2025-10-13T18:16:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.096276 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.096347 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.096370 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.096399 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.096419 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:12Z","lastTransitionTime":"2025-10-13T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.200562 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.200716 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.201247 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.201346 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.201614 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:12Z","lastTransitionTime":"2025-10-13T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.305826 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.305908 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.305927 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.305957 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.305975 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:12Z","lastTransitionTime":"2025-10-13T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.408583 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.408707 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.408741 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.408775 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.408798 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:12Z","lastTransitionTime":"2025-10-13T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.512126 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.512192 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.512211 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.512239 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.512263 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:12Z","lastTransitionTime":"2025-10-13T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.615725 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.615804 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.615837 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.615870 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.615890 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:12Z","lastTransitionTime":"2025-10-13T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.719199 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.719261 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.719279 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.719305 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.719323 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:12Z","lastTransitionTime":"2025-10-13T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.822130 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.822212 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.822237 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.822274 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.822300 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:12Z","lastTransitionTime":"2025-10-13T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.925105 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.925175 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.925195 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.925220 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:12 crc kubenswrapper[4974]: I1013 18:16:12.925241 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:12Z","lastTransitionTime":"2025-10-13T18:16:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.029344 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.029433 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.029458 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.029494 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.029518 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:13Z","lastTransitionTime":"2025-10-13T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.132544 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.132639 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.132700 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.132729 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.132747 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:13Z","lastTransitionTime":"2025-10-13T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.235457 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.235560 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.235577 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.235598 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.235616 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:13Z","lastTransitionTime":"2025-10-13T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.339308 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.339368 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.339386 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.339414 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.339431 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:13Z","lastTransitionTime":"2025-10-13T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.441569 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.441623 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.441641 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.441693 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.441711 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:13Z","lastTransitionTime":"2025-10-13T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.544827 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.544885 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.544903 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.544927 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.544945 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:13Z","lastTransitionTime":"2025-10-13T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.647836 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.647908 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.647927 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.647953 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.647972 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:13Z","lastTransitionTime":"2025-10-13T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.751307 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.751359 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.751376 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.751400 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.751417 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:13Z","lastTransitionTime":"2025-10-13T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.811408 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.811490 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.811438 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:13 crc kubenswrapper[4974]: E1013 18:16:13.811720 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.811881 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:13 crc kubenswrapper[4974]: E1013 18:16:13.811872 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:13 crc kubenswrapper[4974]: E1013 18:16:13.811975 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:13 crc kubenswrapper[4974]: E1013 18:16:13.812088 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.854805 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.854866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.854884 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.854911 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.854929 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:13Z","lastTransitionTime":"2025-10-13T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.958928 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.958999 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.959017 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.959043 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:13 crc kubenswrapper[4974]: I1013 18:16:13.959061 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:13Z","lastTransitionTime":"2025-10-13T18:16:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.010602 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.010690 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.010717 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.010748 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.010772 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: E1013 18:16:14.032368 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.038792 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.038861 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.038884 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.038913 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.038933 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: E1013 18:16:14.059922 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.073078 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.073154 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.073197 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.073235 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.073257 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: E1013 18:16:14.097747 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.103613 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.103712 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.103739 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.103774 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.103802 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: E1013 18:16:14.125852 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.130964 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.131016 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.131034 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.131057 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.131075 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: E1013 18:16:14.152236 4974 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-10-13T18:16:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"71c32b41-77fa-4116-87b2-213f1ff9d252\\\",\\\"systemUUID\\\":\\\"7fb80a93-bf09-453c-9c6a-784a87b26241\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-10-13T18:16:14Z is after 2025-08-24T17:21:41Z" Oct 13 18:16:14 crc kubenswrapper[4974]: E1013 18:16:14.152687 4974 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.155516 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.155590 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.155610 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.155640 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.155704 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.259305 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.259368 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.259386 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.259412 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.259430 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.363268 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.363408 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.363436 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.363458 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.363527 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.470075 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.470149 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.470171 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.470200 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.470225 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.573795 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.573836 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.573848 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.573866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.573880 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.677885 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.677980 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.678012 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.678050 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.678071 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.780372 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.780431 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.780449 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.780508 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.780527 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.883777 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.883853 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.883874 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.883903 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.883922 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.986140 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.986168 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.986176 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.986188 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:14 crc kubenswrapper[4974]: I1013 18:16:14.986197 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:14Z","lastTransitionTime":"2025-10-13T18:16:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.089715 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.089780 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.089798 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.089822 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.089840 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:15Z","lastTransitionTime":"2025-10-13T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.192223 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.192318 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.192338 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.192361 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.192378 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:15Z","lastTransitionTime":"2025-10-13T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.294808 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.294862 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.294878 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.294900 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.294917 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:15Z","lastTransitionTime":"2025-10-13T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.399212 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.399293 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.399347 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.399385 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.399411 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:15Z","lastTransitionTime":"2025-10-13T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.501942 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.502024 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.502048 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.502080 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.502102 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:15Z","lastTransitionTime":"2025-10-13T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.605164 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.605222 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.605239 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.605263 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.605281 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:15Z","lastTransitionTime":"2025-10-13T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.707716 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.707818 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.707836 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.707860 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.707877 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:15Z","lastTransitionTime":"2025-10-13T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.810555 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.810626 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.810690 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.810727 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.810755 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:15Z","lastTransitionTime":"2025-10-13T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.810805 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.810886 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.810724 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:15 crc kubenswrapper[4974]: E1013 18:16:15.811017 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:15 crc kubenswrapper[4974]: E1013 18:16:15.811206 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.811304 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:15 crc kubenswrapper[4974]: E1013 18:16:15.811453 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:15 crc kubenswrapper[4974]: E1013 18:16:15.811585 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.873734 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gwv4w" podStartSLOduration=88.873710068 podStartE2EDuration="1m28.873710068s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:15.872791482 +0000 UTC m=+110.777157622" watchObservedRunningTime="2025-10-13 18:16:15.873710068 +0000 UTC m=+110.778076178" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.907479 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=87.907445964 podStartE2EDuration="1m27.907445964s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:15.902415573 +0000 UTC m=+110.806781723" watchObservedRunningTime="2025-10-13 18:16:15.907445964 +0000 UTC m=+110.811812084" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.914450 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 
18:16:15.914505 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.914523 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.914548 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.914567 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:15Z","lastTransitionTime":"2025-10-13T18:16:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.961736 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xcspx" podStartSLOduration=88.961702887 podStartE2EDuration="1m28.961702887s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:15.944529095 +0000 UTC m=+110.848895215" watchObservedRunningTime="2025-10-13 18:16:15.961702887 +0000 UTC m=+110.866068977" Oct 13 18:16:15 crc kubenswrapper[4974]: I1013 18:16:15.961935 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.961927183 podStartE2EDuration="1m30.961927183s" podCreationTimestamp="2025-10-13 18:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:15.96148024 +0000 UTC 
m=+110.865846350" watchObservedRunningTime="2025-10-13 18:16:15.961927183 +0000 UTC m=+110.866293273" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.016046 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podStartSLOduration=89.016024961 podStartE2EDuration="1m29.016024961s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:16.015756053 +0000 UTC m=+110.920122173" watchObservedRunningTime="2025-10-13 18:16:16.016024961 +0000 UTC m=+110.920391041" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.017132 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.017181 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.017194 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.017214 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.017229 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:16Z","lastTransitionTime":"2025-10-13T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.062202 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-98z75" podStartSLOduration=89.062167736 podStartE2EDuration="1m29.062167736s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:16.032137813 +0000 UTC m=+110.936503913" watchObservedRunningTime="2025-10-13 18:16:16.062167736 +0000 UTC m=+110.966533856" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.081724 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=55.081701914 podStartE2EDuration="55.081701914s" podCreationTimestamp="2025-10-13 18:15:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:16.081593371 +0000 UTC m=+110.985959461" watchObservedRunningTime="2025-10-13 18:16:16.081701914 +0000 UTC m=+110.986068034" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.119312 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.119371 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.119389 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.119413 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.119431 4974 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:16Z","lastTransitionTime":"2025-10-13T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.135103 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-r5fj5" podStartSLOduration=89.135073591 podStartE2EDuration="1m29.135073591s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:16.122958141 +0000 UTC m=+111.027324251" watchObservedRunningTime="2025-10-13 18:16:16.135073591 +0000 UTC m=+111.039439721" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.135628 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-rt4jb" podStartSLOduration=88.135616447 podStartE2EDuration="1m28.135616447s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:16.13468005 +0000 UTC m=+111.039046170" watchObservedRunningTime="2025-10-13 18:16:16.135616447 +0000 UTC m=+111.039982577" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.166009 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=18.165983929 podStartE2EDuration="18.165983929s" podCreationTimestamp="2025-10-13 18:15:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-13 18:16:16.165630149 +0000 UTC m=+111.069996269" watchObservedRunningTime="2025-10-13 18:16:16.165983929 +0000 UTC m=+111.070350049" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.195796 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=91.195771574 podStartE2EDuration="1m31.195771574s" podCreationTimestamp="2025-10-13 18:14:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:16.194342144 +0000 UTC m=+111.098708284" watchObservedRunningTime="2025-10-13 18:16:16.195771574 +0000 UTC m=+111.100137684" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.222429 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.222492 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.222512 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.222536 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.222557 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:16Z","lastTransitionTime":"2025-10-13T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.324932 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.324976 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.324987 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.325004 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.325016 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:16Z","lastTransitionTime":"2025-10-13T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.428338 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.428391 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.428407 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.428430 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.428446 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:16Z","lastTransitionTime":"2025-10-13T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.531370 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.531439 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.531452 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.531492 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.531509 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:16Z","lastTransitionTime":"2025-10-13T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.635128 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.635174 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.635183 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.635197 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.635209 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:16Z","lastTransitionTime":"2025-10-13T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.739294 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.739355 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.739367 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.739386 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.739398 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:16Z","lastTransitionTime":"2025-10-13T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.841830 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.842263 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.842444 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.842601 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.842835 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:16Z","lastTransitionTime":"2025-10-13T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.946751 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.946941 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.947047 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.947090 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:16 crc kubenswrapper[4974]: I1013 18:16:16.947167 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:16Z","lastTransitionTime":"2025-10-13T18:16:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.049879 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.050258 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.050440 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.050780 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.050985 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:17Z","lastTransitionTime":"2025-10-13T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.154119 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.154204 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.154241 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.154279 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.154304 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:17Z","lastTransitionTime":"2025-10-13T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.257685 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.257769 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.257799 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.257832 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.257854 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:17Z","lastTransitionTime":"2025-10-13T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.361165 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.361217 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.361235 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.361257 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.361276 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:17Z","lastTransitionTime":"2025-10-13T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.464506 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.464567 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.464586 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.464610 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.464626 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:17Z","lastTransitionTime":"2025-10-13T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.567556 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.567748 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.567777 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.567810 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.567835 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:17Z","lastTransitionTime":"2025-10-13T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.671475 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.671790 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.671866 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.671891 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.671910 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:17Z","lastTransitionTime":"2025-10-13T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.775627 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.775732 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.775751 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.775774 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.775925 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:17Z","lastTransitionTime":"2025-10-13T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.811459 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.811505 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.811637 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:17 crc kubenswrapper[4974]: E1013 18:16:17.811841 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.811899 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:17 crc kubenswrapper[4974]: E1013 18:16:17.812115 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:17 crc kubenswrapper[4974]: E1013 18:16:17.812231 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:17 crc kubenswrapper[4974]: E1013 18:16:17.812397 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.879077 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.879160 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.879197 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.879228 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.879251 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:17Z","lastTransitionTime":"2025-10-13T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.982304 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.982367 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.982383 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.982408 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:17 crc kubenswrapper[4974]: I1013 18:16:17.982426 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:17Z","lastTransitionTime":"2025-10-13T18:16:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.085398 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.085470 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.085489 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.085516 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.085533 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:18Z","lastTransitionTime":"2025-10-13T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.187829 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.187867 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.187878 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.187894 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.187906 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:18Z","lastTransitionTime":"2025-10-13T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.290559 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.290611 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.290628 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.290679 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.290715 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:18Z","lastTransitionTime":"2025-10-13T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.393301 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.393350 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.393363 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.393388 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.393401 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:18Z","lastTransitionTime":"2025-10-13T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.496447 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.496506 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.496527 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.496561 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.496583 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:18Z","lastTransitionTime":"2025-10-13T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.599907 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.599986 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.600013 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.600043 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.600067 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:18Z","lastTransitionTime":"2025-10-13T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.703256 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.703329 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.703357 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.703388 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.703408 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:18Z","lastTransitionTime":"2025-10-13T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.806146 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.806216 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.806235 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.806260 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.806278 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:18Z","lastTransitionTime":"2025-10-13T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.909486 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.909543 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.909560 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.909582 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:18 crc kubenswrapper[4974]: I1013 18:16:18.909599 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:18Z","lastTransitionTime":"2025-10-13T18:16:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.015159 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.015226 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.015244 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.015268 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.015286 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:19Z","lastTransitionTime":"2025-10-13T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.119333 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.119403 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.119417 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.119439 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.119459 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:19Z","lastTransitionTime":"2025-10-13T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.222501 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.222552 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.222567 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.222587 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.222602 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:19Z","lastTransitionTime":"2025-10-13T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.325904 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.325945 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.325959 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.325979 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.325993 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:19Z","lastTransitionTime":"2025-10-13T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.429116 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.429501 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.429740 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.429925 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.430083 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:19Z","lastTransitionTime":"2025-10-13T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.532860 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.533161 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.533247 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.533342 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.533438 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:19Z","lastTransitionTime":"2025-10-13T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.637108 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.637582 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.637800 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.637961 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.638107 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:19Z","lastTransitionTime":"2025-10-13T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.741312 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.741376 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.741400 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.741430 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.741453 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:19Z","lastTransitionTime":"2025-10-13T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.811388 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.811398 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.811467 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:19 crc kubenswrapper[4974]: E1013 18:16:19.811601 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.811760 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:19 crc kubenswrapper[4974]: E1013 18:16:19.811897 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:19 crc kubenswrapper[4974]: E1013 18:16:19.812071 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:19 crc kubenswrapper[4974]: E1013 18:16:19.812213 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.844159 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.844255 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.844282 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.844312 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.844333 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:19Z","lastTransitionTime":"2025-10-13T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.948133 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.948197 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.948214 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.948238 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:19 crc kubenswrapper[4974]: I1013 18:16:19.948256 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:19Z","lastTransitionTime":"2025-10-13T18:16:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.051790 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.051833 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.051841 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.051856 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.051874 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:20Z","lastTransitionTime":"2025-10-13T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.154457 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.154529 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.154547 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.154571 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.154588 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:20Z","lastTransitionTime":"2025-10-13T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.257953 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.258021 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.258039 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.258066 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.258086 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:20Z","lastTransitionTime":"2025-10-13T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.361244 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.361284 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.361294 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.361313 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.361325 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:20Z","lastTransitionTime":"2025-10-13T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.464542 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.464605 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.464636 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.464697 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.464722 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:20Z","lastTransitionTime":"2025-10-13T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.567950 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.568006 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.568022 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.568047 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.568066 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:20Z","lastTransitionTime":"2025-10-13T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.671451 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.671502 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.671522 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.671545 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.671563 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:20Z","lastTransitionTime":"2025-10-13T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.774581 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.774703 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.774724 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.774748 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.774768 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:20Z","lastTransitionTime":"2025-10-13T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.878032 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.878082 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.878095 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.878111 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.878123 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:20Z","lastTransitionTime":"2025-10-13T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.981245 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.981305 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.981322 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.981347 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:20 crc kubenswrapper[4974]: I1013 18:16:20.981364 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:20Z","lastTransitionTime":"2025-10-13T18:16:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.084536 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.084591 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.084614 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.084641 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.084697 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:21Z","lastTransitionTime":"2025-10-13T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.188058 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.188158 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.188188 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.188245 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.188268 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:21Z","lastTransitionTime":"2025-10-13T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.291315 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.291376 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.291392 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.291416 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.291435 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:21Z","lastTransitionTime":"2025-10-13T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.394514 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.394585 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.394603 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.394632 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.394688 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:21Z","lastTransitionTime":"2025-10-13T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.459988 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/1.log" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.460493 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/0.log" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.460564 4974 generic.go:334] "Generic (PLEG): container finished" podID="9c38c0e3-9bee-402b-adf0-27ac9e31c0f0" containerID="e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94" exitCode=1 Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.460618 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xcspx" event={"ID":"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0","Type":"ContainerDied","Data":"e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94"} Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.460729 4974 scope.go:117] "RemoveContainer" containerID="e48877090dfd8ff2ef39af0672f39f35c7fc60d19f29fb4754275a0fa736b490" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.461040 4974 scope.go:117] "RemoveContainer" containerID="e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94" Oct 13 18:16:21 crc kubenswrapper[4974]: E1013 18:16:21.461199 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xcspx_openshift-multus(9c38c0e3-9bee-402b-adf0-27ac9e31c0f0)\"" pod="openshift-multus/multus-xcspx" podUID="9c38c0e3-9bee-402b-adf0-27ac9e31c0f0" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.497719 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:21 crc 
kubenswrapper[4974]: I1013 18:16:21.497836 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.497856 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.497886 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.497904 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:21Z","lastTransitionTime":"2025-10-13T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.600206 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.600247 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.600261 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.600285 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.600300 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:21Z","lastTransitionTime":"2025-10-13T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.713071 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.713155 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.713179 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.713207 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.713229 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:21Z","lastTransitionTime":"2025-10-13T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.810611 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:21 crc kubenswrapper[4974]: E1013 18:16:21.810815 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.810822 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.810884 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.810898 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:21 crc kubenswrapper[4974]: E1013 18:16:21.811566 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:21 crc kubenswrapper[4974]: E1013 18:16:21.811829 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:21 crc kubenswrapper[4974]: E1013 18:16:21.811915 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.812830 4974 scope.go:117] "RemoveContainer" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:16:21 crc kubenswrapper[4974]: E1013 18:16:21.813350 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-zwcs8_openshift-ovn-kubernetes(d9f54cc7-5b3b-4481-9be5-f03df1854435)\"" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.815580 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.815618 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.815634 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.815686 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.815703 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:21Z","lastTransitionTime":"2025-10-13T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.918503 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.918611 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.918632 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.918725 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:21 crc kubenswrapper[4974]: I1013 18:16:21.918745 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:21Z","lastTransitionTime":"2025-10-13T18:16:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.023041 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.023461 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.023695 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.023913 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.024105 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:22Z","lastTransitionTime":"2025-10-13T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.127957 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.128030 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.128054 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.128084 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.128107 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:22Z","lastTransitionTime":"2025-10-13T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.231119 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.231495 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.231636 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.231844 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.231996 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:22Z","lastTransitionTime":"2025-10-13T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.335307 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.335588 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.335677 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.335751 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.335813 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:22Z","lastTransitionTime":"2025-10-13T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.438784 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.439042 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.439192 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.439349 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.439480 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:22Z","lastTransitionTime":"2025-10-13T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.466630 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/1.log" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.542037 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.542097 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.542112 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.542135 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.542152 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:22Z","lastTransitionTime":"2025-10-13T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.644561 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.644630 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.644699 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.644730 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.644753 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:22Z","lastTransitionTime":"2025-10-13T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.747807 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.747905 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.747939 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.747968 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.747989 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:22Z","lastTransitionTime":"2025-10-13T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.851052 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.851113 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.851131 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.851152 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.851169 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:22Z","lastTransitionTime":"2025-10-13T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.954555 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.954637 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.954735 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.954770 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:22 crc kubenswrapper[4974]: I1013 18:16:22.954792 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:22Z","lastTransitionTime":"2025-10-13T18:16:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.057361 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.057469 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.057496 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.057528 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.057550 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:23Z","lastTransitionTime":"2025-10-13T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.160716 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.160773 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.160799 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.160829 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.160851 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:23Z","lastTransitionTime":"2025-10-13T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.263812 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.263890 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.263914 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.263973 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.264032 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:23Z","lastTransitionTime":"2025-10-13T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.367747 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.367819 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.367837 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.367868 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.367889 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:23Z","lastTransitionTime":"2025-10-13T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.471204 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.471274 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.471291 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.471377 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.471400 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:23Z","lastTransitionTime":"2025-10-13T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.574892 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.574966 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.575053 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.575082 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.575100 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:23Z","lastTransitionTime":"2025-10-13T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.678216 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.678291 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.678317 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.678349 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.678373 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:23Z","lastTransitionTime":"2025-10-13T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.781777 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.781834 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.781852 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.781876 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.781895 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:23Z","lastTransitionTime":"2025-10-13T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.811523 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.811536 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:23 crc kubenswrapper[4974]: E1013 18:16:23.811755 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.811790 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.811890 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:23 crc kubenswrapper[4974]: E1013 18:16:23.812054 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:23 crc kubenswrapper[4974]: E1013 18:16:23.812141 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:23 crc kubenswrapper[4974]: E1013 18:16:23.812294 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.884923 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.884973 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.884990 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.885012 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.885030 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:23Z","lastTransitionTime":"2025-10-13T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.988056 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.988110 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.988126 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.988147 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:23 crc kubenswrapper[4974]: I1013 18:16:23.988164 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:23Z","lastTransitionTime":"2025-10-13T18:16:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.091072 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.091149 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.091167 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.091195 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.091212 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:24Z","lastTransitionTime":"2025-10-13T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.194155 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.194215 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.194238 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.194272 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.194295 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:24Z","lastTransitionTime":"2025-10-13T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.297404 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.297507 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.297547 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.297573 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.297589 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:24Z","lastTransitionTime":"2025-10-13T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.401077 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.401193 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.401221 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.401255 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.401277 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:24Z","lastTransitionTime":"2025-10-13T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.504602 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.504683 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.504708 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.504740 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.504760 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:24Z","lastTransitionTime":"2025-10-13T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.532264 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.532348 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.532386 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.532417 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.532440 4974 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-10-13T18:16:24Z","lastTransitionTime":"2025-10-13T18:16:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.589404 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m"] Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.590144 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.592527 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.592715 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.592780 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.592888 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.701944 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.701987 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.702021 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.702056 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.702075 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.803082 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.803181 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.803270 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.803327 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.803408 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.803501 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.805152 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-service-ca\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.805265 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.819844 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.832823 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-rm86m\" (UID: \"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: I1013 18:16:24.907285 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" Oct 13 18:16:24 crc kubenswrapper[4974]: W1013 18:16:24.928761 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3ab2e44_a8d6_4fab_bfbc_d3c27fdcd4fb.slice/crio-1331c4cf26308dbdf1fbb16a6876838606d9beeef48e9cbf84ce7e9a8527a73b WatchSource:0}: Error finding container 1331c4cf26308dbdf1fbb16a6876838606d9beeef48e9cbf84ce7e9a8527a73b: Status 404 returned error can't find the container with id 1331c4cf26308dbdf1fbb16a6876838606d9beeef48e9cbf84ce7e9a8527a73b Oct 13 18:16:25 crc kubenswrapper[4974]: I1013 18:16:25.482713 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" event={"ID":"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb","Type":"ContainerStarted","Data":"3963cc2d0629a0d75af93eaf25bf99fb8b87e22285b2691cfc1e805d5052e226"} Oct 13 18:16:25 crc kubenswrapper[4974]: I1013 18:16:25.483141 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" event={"ID":"e3ab2e44-a8d6-4fab-bfbc-d3c27fdcd4fb","Type":"ContainerStarted","Data":"1331c4cf26308dbdf1fbb16a6876838606d9beeef48e9cbf84ce7e9a8527a73b"} Oct 13 18:16:25 crc kubenswrapper[4974]: I1013 18:16:25.504770 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-rm86m" podStartSLOduration=98.504743514 podStartE2EDuration="1m38.504743514s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:25.503212781 +0000 UTC m=+120.407578891" watchObservedRunningTime="2025-10-13 18:16:25.504743514 +0000 UTC m=+120.409109624" Oct 13 18:16:25 crc kubenswrapper[4974]: E1013 18:16:25.742436 4974 
kubelet_node_status.go:497] "Node not becoming ready in time after startup" Oct 13 18:16:25 crc kubenswrapper[4974]: I1013 18:16:25.810934 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:25 crc kubenswrapper[4974]: I1013 18:16:25.810992 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:25 crc kubenswrapper[4974]: I1013 18:16:25.810954 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:25 crc kubenswrapper[4974]: E1013 18:16:25.814278 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:25 crc kubenswrapper[4974]: I1013 18:16:25.814702 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:25 crc kubenswrapper[4974]: E1013 18:16:25.814868 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:25 crc kubenswrapper[4974]: E1013 18:16:25.815211 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:25 crc kubenswrapper[4974]: E1013 18:16:25.815571 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:25 crc kubenswrapper[4974]: E1013 18:16:25.916175 4974 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 13 18:16:27 crc kubenswrapper[4974]: I1013 18:16:27.810617 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:27 crc kubenswrapper[4974]: I1013 18:16:27.810800 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:27 crc kubenswrapper[4974]: E1013 18:16:27.810890 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:27 crc kubenswrapper[4974]: I1013 18:16:27.810917 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:27 crc kubenswrapper[4974]: I1013 18:16:27.811002 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:27 crc kubenswrapper[4974]: E1013 18:16:27.811100 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:27 crc kubenswrapper[4974]: E1013 18:16:27.811261 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:27 crc kubenswrapper[4974]: E1013 18:16:27.811367 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:29 crc kubenswrapper[4974]: I1013 18:16:29.811070 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:29 crc kubenswrapper[4974]: E1013 18:16:29.811788 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:29 crc kubenswrapper[4974]: I1013 18:16:29.811214 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:29 crc kubenswrapper[4974]: E1013 18:16:29.811987 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:29 crc kubenswrapper[4974]: I1013 18:16:29.811164 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:29 crc kubenswrapper[4974]: E1013 18:16:29.812164 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:29 crc kubenswrapper[4974]: I1013 18:16:29.811248 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:29 crc kubenswrapper[4974]: E1013 18:16:29.812376 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:30 crc kubenswrapper[4974]: E1013 18:16:30.917705 4974 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Oct 13 18:16:31 crc kubenswrapper[4974]: I1013 18:16:31.811493 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:31 crc kubenswrapper[4974]: I1013 18:16:31.811681 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:31 crc kubenswrapper[4974]: E1013 18:16:31.811891 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:31 crc kubenswrapper[4974]: I1013 18:16:31.811961 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:31 crc kubenswrapper[4974]: I1013 18:16:31.811936 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:31 crc kubenswrapper[4974]: E1013 18:16:31.812062 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:31 crc kubenswrapper[4974]: E1013 18:16:31.812195 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:31 crc kubenswrapper[4974]: E1013 18:16:31.812326 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:33 crc kubenswrapper[4974]: I1013 18:16:33.811594 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:33 crc kubenswrapper[4974]: I1013 18:16:33.811756 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:33 crc kubenswrapper[4974]: E1013 18:16:33.811821 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:33 crc kubenswrapper[4974]: I1013 18:16:33.811875 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:33 crc kubenswrapper[4974]: E1013 18:16:33.811942 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:33 crc kubenswrapper[4974]: I1013 18:16:33.812018 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:33 crc kubenswrapper[4974]: E1013 18:16:33.812585 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:33 crc kubenswrapper[4974]: E1013 18:16:33.812997 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:33 crc kubenswrapper[4974]: I1013 18:16:33.813130 4974 scope.go:117] "RemoveContainer" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:16:34 crc kubenswrapper[4974]: I1013 18:16:34.517753 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/3.log" Oct 13 18:16:34 crc kubenswrapper[4974]: I1013 18:16:34.520594 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerStarted","Data":"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8"} Oct 13 18:16:34 crc kubenswrapper[4974]: I1013 18:16:34.521711 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:16:34 crc kubenswrapper[4974]: I1013 18:16:34.668505 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podStartSLOduration=107.668481416 podStartE2EDuration="1m47.668481416s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:34.56561182 +0000 UTC m=+129.469977900" watchObservedRunningTime="2025-10-13 18:16:34.668481416 +0000 UTC m=+129.572847506" Oct 13 18:16:34 crc kubenswrapper[4974]: I1013 18:16:34.668926 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z9hj4"] Oct 13 18:16:34 crc kubenswrapper[4974]: I1013 18:16:34.669045 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:34 crc kubenswrapper[4974]: E1013 18:16:34.669151 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:35 crc kubenswrapper[4974]: I1013 18:16:35.811168 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:35 crc kubenswrapper[4974]: I1013 18:16:35.811217 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:35 crc kubenswrapper[4974]: I1013 18:16:35.811391 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:35 crc kubenswrapper[4974]: I1013 18:16:35.811609 4974 scope.go:117] "RemoveContainer" containerID="e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94" Oct 13 18:16:35 crc kubenswrapper[4974]: E1013 18:16:35.813285 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:35 crc kubenswrapper[4974]: E1013 18:16:35.813345 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:35 crc kubenswrapper[4974]: E1013 18:16:35.813500 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:35 crc kubenswrapper[4974]: E1013 18:16:35.919339 4974 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Oct 13 18:16:36 crc kubenswrapper[4974]: I1013 18:16:36.532101 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/1.log" Oct 13 18:16:36 crc kubenswrapper[4974]: I1013 18:16:36.532202 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xcspx" event={"ID":"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0","Type":"ContainerStarted","Data":"a51eb90e50915e1d2d4940bd5a7e011787bd8e414e7a1abfc7d70efc8c48343d"} Oct 13 18:16:36 crc kubenswrapper[4974]: I1013 18:16:36.810724 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:36 crc kubenswrapper[4974]: E1013 18:16:36.811242 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:37 crc kubenswrapper[4974]: I1013 18:16:37.811510 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:37 crc kubenswrapper[4974]: E1013 18:16:37.811791 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:37 crc kubenswrapper[4974]: I1013 18:16:37.811836 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:37 crc kubenswrapper[4974]: E1013 18:16:37.812019 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:37 crc kubenswrapper[4974]: I1013 18:16:37.812787 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:37 crc kubenswrapper[4974]: E1013 18:16:37.813057 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:38 crc kubenswrapper[4974]: I1013 18:16:38.811134 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:38 crc kubenswrapper[4974]: E1013 18:16:38.811407 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:39 crc kubenswrapper[4974]: I1013 18:16:39.811104 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:39 crc kubenswrapper[4974]: I1013 18:16:39.811150 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:39 crc kubenswrapper[4974]: I1013 18:16:39.811188 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:39 crc kubenswrapper[4974]: E1013 18:16:39.811291 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Oct 13 18:16:39 crc kubenswrapper[4974]: E1013 18:16:39.811405 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Oct 13 18:16:39 crc kubenswrapper[4974]: E1013 18:16:39.811512 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Oct 13 18:16:40 crc kubenswrapper[4974]: I1013 18:16:40.811360 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:40 crc kubenswrapper[4974]: E1013 18:16:40.811580 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-z9hj4" podUID="a260247c-2399-42b5-bddc-73e38659680b" Oct 13 18:16:41 crc kubenswrapper[4974]: I1013 18:16:41.810927 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:41 crc kubenswrapper[4974]: I1013 18:16:41.810949 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:41 crc kubenswrapper[4974]: I1013 18:16:41.810949 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:41 crc kubenswrapper[4974]: I1013 18:16:41.813982 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Oct 13 18:16:41 crc kubenswrapper[4974]: I1013 18:16:41.814331 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Oct 13 18:16:41 crc kubenswrapper[4974]: I1013 18:16:41.814371 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Oct 13 18:16:41 crc kubenswrapper[4974]: I1013 18:16:41.816720 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Oct 13 18:16:42 crc kubenswrapper[4974]: I1013 18:16:42.810688 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:16:42 crc kubenswrapper[4974]: I1013 18:16:42.813937 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Oct 13 18:16:42 crc kubenswrapper[4974]: I1013 18:16:42.814173 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Oct 13 18:16:43 crc kubenswrapper[4974]: I1013 18:16:43.424102 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.184984 4974 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.240782 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mlpn9"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.242001 4974 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.242029 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2qjrb"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.242964 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.243685 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-knlnk"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.244415 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.248456 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.249029 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.252202 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.254120 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.267965 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.268887 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.268986 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.269271 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.269470 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.269561 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.269814 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.269880 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.269571 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270177 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270314 4974 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-console/console-f9d7485db-jbznq"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270367 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.271069 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270449 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270555 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.271347 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270625 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270717 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270713 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270768 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270788 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270792 
4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270838 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270898 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270911 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.270993 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.271020 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.271034 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.272218 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.277286 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.277930 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.278050 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.278157 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.278274 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.278385 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.278608 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.280472 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gvsws"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.281063 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.284734 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.285675 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.288790 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.294352 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.297897 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.297944 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j96z\" (UniqueName: \"kubernetes.io/projected/ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd-kube-api-access-6j96z\") pod \"cluster-samples-operator-665b6dd947-22fnv\" (UID: \"ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.297972 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dqqw\" (UniqueName: 
\"kubernetes.io/projected/5b94a42e-d2ac-46e7-a400-3702e7f5f261-kube-api-access-8dqqw\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.297995 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-oauth-config\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298015 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-client-ca\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298039 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-images\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298061 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-serving-cert\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: 
I1013 18:16:45.298079 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-trusted-ca-bundle\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298114 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298134 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298158 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298178 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-serving-cert\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298197 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-audit-policies\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298219 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298237 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4dff75c-23a2-40e2-9259-e056682367d7-serving-cert\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298257 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5tmd\" (UniqueName: \"kubernetes.io/projected/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-kube-api-access-n5tmd\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc 
kubenswrapper[4974]: I1013 18:16:45.298276 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-service-ca\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298296 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b94a42e-d2ac-46e7-a400-3702e7f5f261-auth-proxy-config\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298317 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-image-import-ca\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298338 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298359 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd-samples-operator-tls\") pod 
\"cluster-samples-operator-665b6dd947-22fnv\" (UID: \"ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298378 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-encryption-config\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298398 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-audit-dir\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298417 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b6539e39-b08d-4c27-a689-6401b299e123-audit-dir\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298446 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-console-config\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298466 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2vhrm\" (UniqueName: \"kubernetes.io/projected/b6539e39-b08d-4c27-a689-6401b299e123-kube-api-access-2vhrm\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298487 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298507 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b94a42e-d2ac-46e7-a400-3702e7f5f261-config\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298536 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5b94a42e-d2ac-46e7-a400-3702e7f5f261-machine-approver-tls\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298556 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-config\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298577 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-config\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298597 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298624 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-etcd-client\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298670 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-config\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298692 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-etcd-serving-ca\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298712 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298732 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pdlf\" (UniqueName: \"kubernetes.io/projected/f4dff75c-23a2-40e2-9259-e056682367d7-kube-api-access-2pdlf\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298751 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-node-pullsecrets\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298772 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-oauth-serving-cert\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 
18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298793 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-client-ca\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298817 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9lr4\" (UniqueName: \"kubernetes.io/projected/76842663-4197-4c71-8601-6a657814388b-kube-api-access-q9lr4\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298839 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-audit\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298859 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298877 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-config\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298897 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlxns\" (UniqueName: \"kubernetes.io/projected/ce0c606d-4062-4f6a-afec-752440b5580c-kube-api-access-wlxns\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298915 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76842663-4197-4c71-8601-6a657814388b-serving-cert\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298935 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298958 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rfpn\" (UniqueName: \"kubernetes.io/projected/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-kube-api-access-9rfpn\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298978 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.298999 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.299021 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.299788 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.300284 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.300497 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Oct 13 18:16:45 crc 
kubenswrapper[4974]: I1013 18:16:45.300701 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.300824 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.300869 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.300966 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.301065 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.301144 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.301171 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.300836 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.301252 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.301296 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.301335 4974 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.301235 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.301258 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.301535 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.302514 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.302801 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.302909 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.303378 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-khv5p"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.304037 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.304572 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.304976 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.305159 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.305312 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-rr898"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.305323 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.305739 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-rr898" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.308093 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tp4tf"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.308547 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dsjgz"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.308871 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.309133 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.309168 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.309141 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.310744 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xgp8t"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.311037 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.311412 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.311784 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.311795 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.312206 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.328189 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.328477 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nlffn"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.329438 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.330224 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.331215 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.331465 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.339585 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.342406 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.342125 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.343718 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.344181 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.344504 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.344767 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.344896 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.345191 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.345298 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.345547 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.345891 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.349006 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Oct 13 18:16:45 crc 
kubenswrapper[4974]: I1013 18:16:45.361282 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.361617 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9p2fb"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.361969 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.362334 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.362491 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.363939 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.364163 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.364264 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.364366 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.364444 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.364529 4974 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.365608 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.366037 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.366158 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.366471 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.366555 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.366679 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.366834 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.366951 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2qjrb"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.366982 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.370184 4974 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.370775 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.371387 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.371541 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.372954 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.373440 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.374163 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.374397 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.374552 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.374711 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.374855 4974 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.374972 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.375303 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.375454 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.384619 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.386586 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.388890 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.389613 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.389996 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.390020 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399402 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399435 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/820c88f3-f632-4375-b569-d796628c8f73-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399454 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j96z\" (UniqueName: \"kubernetes.io/projected/ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd-kube-api-access-6j96z\") pod \"cluster-samples-operator-665b6dd947-22fnv\" (UID: \"ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399474 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8dqqw\" (UniqueName: \"kubernetes.io/projected/5b94a42e-d2ac-46e7-a400-3702e7f5f261-kube-api-access-8dqqw\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399493 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-oauth-config\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399509 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ffe8fb4-5a71-4ef6-aef6-96603b8e7551-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m7pjr\" (UID: \"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399526 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-client-ca\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399540 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/820c88f3-f632-4375-b569-d796628c8f73-serving-cert\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399553 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/820c88f3-f632-4375-b569-d796628c8f73-config\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399571 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-images\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399586 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-serving-cert\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399599 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-trusted-ca-bundle\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399621 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-session\") pod 
\"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399637 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399668 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba55033e-0bef-4588-bdac-820122c4fd4e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lbflg\" (UID: \"ba55033e-0bef-4588-bdac-820122c4fd4e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399686 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-serving-cert\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399716 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399732 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n5tmd\" (UniqueName: \"kubernetes.io/projected/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-kube-api-access-n5tmd\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399748 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-audit-policies\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399762 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399776 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4dff75c-23a2-40e2-9259-e056682367d7-serving-cert\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399800 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfwp5\" (UniqueName: \"kubernetes.io/projected/820c88f3-f632-4375-b569-d796628c8f73-kube-api-access-rfwp5\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399815 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-service-ca\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399830 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b94a42e-d2ac-46e7-a400-3702e7f5f261-auth-proxy-config\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399860 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ffe8fb4-5a71-4ef6-aef6-96603b8e7551-config\") pod \"kube-apiserver-operator-766d6c64bb-m7pjr\" (UID: \"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399875 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-image-import-ca\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.399894 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.400460 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-22fnv\" (UID: \"ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.400494 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-encryption-config\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.402523 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/820c88f3-f632-4375-b569-d796628c8f73-service-ca-bundle\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.402815 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-audit-dir\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 
18:16:45.402833 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b6539e39-b08d-4c27-a689-6401b299e123-audit-dir\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.402850 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-console-config\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.402865 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vhrm\" (UniqueName: \"kubernetes.io/projected/b6539e39-b08d-4c27-a689-6401b299e123-kube-api-access-2vhrm\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.402880 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6tvd\" (UniqueName: \"kubernetes.io/projected/4b386d31-0a67-4d5a-910d-79a22339feaf-kube-api-access-m6tvd\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.402901 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: 
\"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.402920 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b94a42e-d2ac-46e7-a400-3702e7f5f261-config\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.402935 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba55033e-0bef-4588-bdac-820122c4fd4e-config\") pod \"kube-controller-manager-operator-78b949d7b-lbflg\" (UID: \"ba55033e-0bef-4588-bdac-820122c4fd4e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.403043 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5b94a42e-d2ac-46e7-a400-3702e7f5f261-machine-approver-tls\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.403061 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-config\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.403078 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b386d31-0a67-4d5a-910d-79a22339feaf-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.403279 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-config\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405560 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405601 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-etcd-client\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405624 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-config\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405639 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-etcd-serving-ca\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405669 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c139dd79-4db7-4c96-83af-378886acfcf8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b27hb\" (UID: \"c139dd79-4db7-4c96-83af-378886acfcf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405685 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c139dd79-4db7-4c96-83af-378886acfcf8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b27hb\" (UID: \"c139dd79-4db7-4c96-83af-378886acfcf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405716 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-node-pullsecrets\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405733 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: 
\"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405750 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pdlf\" (UniqueName: \"kubernetes.io/projected/f4dff75c-23a2-40e2-9259-e056682367d7-kube-api-access-2pdlf\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405779 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-oauth-serving-cert\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405798 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-client-ca\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405817 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9lr4\" (UniqueName: \"kubernetes.io/projected/76842663-4197-4c71-8601-6a657814388b-kube-api-access-q9lr4\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405836 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/4b386d31-0a67-4d5a-910d-79a22339feaf-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405854 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-audit\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405870 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405887 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-config\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405902 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ffe8fb4-5a71-4ef6-aef6-96603b8e7551-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m7pjr\" (UID: \"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:45 crc 
kubenswrapper[4974]: I1013 18:16:45.405933 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlxns\" (UniqueName: \"kubernetes.io/projected/ce0c606d-4062-4f6a-afec-752440b5580c-kube-api-access-wlxns\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405949 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76842663-4197-4c71-8601-6a657814388b-serving-cert\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405965 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqdwk\" (UniqueName: \"kubernetes.io/projected/c139dd79-4db7-4c96-83af-378886acfcf8-kube-api-access-qqdwk\") pod \"openshift-apiserver-operator-796bbdcf4f-b27hb\" (UID: \"c139dd79-4db7-4c96-83af-378886acfcf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405981 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b386d31-0a67-4d5a-910d-79a22339feaf-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.405999 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.406017 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rfpn\" (UniqueName: \"kubernetes.io/projected/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-kube-api-access-9rfpn\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.406032 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.406048 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.406064 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba55033e-0bef-4588-bdac-820122c4fd4e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lbflg\" (UID: \"ba55033e-0bef-4588-bdac-820122c4fd4e\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.406083 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.407936 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.408237 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.408452 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.411370 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.432764 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b6539e39-b08d-4c27-a689-6401b299e123-audit-dir\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.435076 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-config\") pod \"apiserver-76f77b778f-mlpn9\" (UID: 
\"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.435673 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.436687 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b94a42e-d2ac-46e7-a400-3702e7f5f261-config\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.438416 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.439085 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-client-ca\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.439943 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc 
kubenswrapper[4974]: I1013 18:16:45.441480 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-images\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.442151 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-console-config\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.442692 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-audit\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.443181 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-config\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.444543 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 
18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.445285 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-trusted-ca-bundle\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.445759 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.446490 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-service-ca\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.446959 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5b94a42e-d2ac-46e7-a400-3702e7f5f261-auth-proxy-config\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.447316 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-config\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.447998 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-oauth-serving-cert\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.448608 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-client-ca\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.449066 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-audit-policies\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.450473 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.450522 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.451434 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-node-pullsecrets\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.451488 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-etcd-serving-ca\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.451786 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.452600 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.452820 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-image-import-ca\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.453317 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76842663-4197-4c71-8601-6a657814388b-serving-cert\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.453353 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-serving-cert\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.453426 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.454078 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-22fnv\" (UID: \"ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.454149 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.454738 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.455255 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4dff75c-23a2-40e2-9259-e056682367d7-serving-cert\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.455606 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.412058 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-audit-dir\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.455825 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-encryption-config\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.457245 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-config\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.457280 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-trusted-ca-bundle\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.457389 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.457692 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.457782 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.457882 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.459070 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-serving-cert\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.461343 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-oauth-config\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.462838 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.463182 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-etcd-client\") pod 
\"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.463246 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.463706 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.463981 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.464031 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/5b94a42e-d2ac-46e7-a400-3702e7f5f261-machine-approver-tls\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.465505 4974 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-4r245"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.465779 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.472046 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.473144 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.473548 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.473754 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.473869 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.474230 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.478573 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.481750 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.483434 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.484947 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.485111 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.487690 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wph2s"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.487797 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.488321 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.488466 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.488616 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rdq7v"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.488844 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.488962 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mlpn9"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.490795 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.489003 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.491583 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492048 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492331 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fgm8n"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492820 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tp4tf"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492839 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-knlnk"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492848 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492858 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-khv5p"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492867 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rr898"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492876 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gvsws"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492883 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492892 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492900 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv"] Oct 13 18:16:45 crc 
kubenswrapper[4974]: I1013 18:16:45.492908 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xgp8t"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492917 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jbznq"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.492956 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.493090 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.493402 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.496771 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.497815 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.499075 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nlffn"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.500024 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4r245"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.501836 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.503432 4974 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dsjgz"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.503480 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.504451 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.505085 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.505742 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.506925 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/820c88f3-f632-4375-b569-d796628c8f73-serving-cert\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.506952 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/820c88f3-f632-4375-b569-d796628c8f73-config\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.506971 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba55033e-0bef-4588-bdac-820122c4fd4e-kube-api-access\") pod 
\"kube-controller-manager-operator-78b949d7b-lbflg\" (UID: \"ba55033e-0bef-4588-bdac-820122c4fd4e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507004 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfwp5\" (UniqueName: \"kubernetes.io/projected/820c88f3-f632-4375-b569-d796628c8f73-kube-api-access-rfwp5\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507022 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ffe8fb4-5a71-4ef6-aef6-96603b8e7551-config\") pod \"kube-apiserver-operator-766d6c64bb-m7pjr\" (UID: \"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507040 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/820c88f3-f632-4375-b569-d796628c8f73-service-ca-bundle\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507069 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6tvd\" (UniqueName: \"kubernetes.io/projected/4b386d31-0a67-4d5a-910d-79a22339feaf-kube-api-access-m6tvd\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc 
kubenswrapper[4974]: I1013 18:16:45.507088 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba55033e-0bef-4588-bdac-820122c4fd4e-config\") pod \"kube-controller-manager-operator-78b949d7b-lbflg\" (UID: \"ba55033e-0bef-4588-bdac-820122c4fd4e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507114 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b386d31-0a67-4d5a-910d-79a22339feaf-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507146 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c139dd79-4db7-4c96-83af-378886acfcf8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b27hb\" (UID: \"c139dd79-4db7-4c96-83af-378886acfcf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507176 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c139dd79-4db7-4c96-83af-378886acfcf8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b27hb\" (UID: \"c139dd79-4db7-4c96-83af-378886acfcf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507200 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns"] Oct 13 18:16:45 crc kubenswrapper[4974]: 
I1013 18:16:45.507206 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b386d31-0a67-4d5a-910d-79a22339feaf-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507225 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqdwk\" (UniqueName: \"kubernetes.io/projected/c139dd79-4db7-4c96-83af-378886acfcf8-kube-api-access-qqdwk\") pod \"openshift-apiserver-operator-796bbdcf4f-b27hb\" (UID: \"c139dd79-4db7-4c96-83af-378886acfcf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507239 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b386d31-0a67-4d5a-910d-79a22339feaf-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507254 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ffe8fb4-5a71-4ef6-aef6-96603b8e7551-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m7pjr\" (UID: \"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507284 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba55033e-0bef-4588-bdac-820122c4fd4e-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-lbflg\" (UID: \"ba55033e-0bef-4588-bdac-820122c4fd4e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507300 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/820c88f3-f632-4375-b569-d796628c8f73-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507327 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ffe8fb4-5a71-4ef6-aef6-96603b8e7551-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m7pjr\" (UID: \"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.507781 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/820c88f3-f632-4375-b569-d796628c8f73-service-ca-bundle\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.509261 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c139dd79-4db7-4c96-83af-378886acfcf8-config\") pod \"openshift-apiserver-operator-796bbdcf4f-b27hb\" (UID: \"c139dd79-4db7-4c96-83af-378886acfcf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.509310 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/820c88f3-f632-4375-b569-d796628c8f73-config\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.509844 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/820c88f3-f632-4375-b569-d796628c8f73-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.510319 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.511051 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c139dd79-4db7-4c96-83af-378886acfcf8-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-b27hb\" (UID: \"c139dd79-4db7-4c96-83af-378886acfcf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.511303 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/820c88f3-f632-4375-b569-d796628c8f73-serving-cert\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.511705 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.512569 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ng69k"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.515263 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.515289 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-7486c"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.515490 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.516086 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.516595 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.518404 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.519526 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.520800 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.522212 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ng69k"] Oct 13 18:16:45 
crc kubenswrapper[4974]: I1013 18:16:45.522983 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.523709 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.523991 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.525058 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.526742 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fgm8n"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.527346 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wph2s"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.528422 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rdq7v"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.529776 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xfvq4"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.530699 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xfvq4"] Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.530739 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xfvq4" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.543441 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.570573 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.580977 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b386d31-0a67-4d5a-910d-79a22339feaf-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.584225 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.604069 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.624553 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.643754 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.664259 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.685112 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 
18:16:45.703872 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.725499 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.744176 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.763751 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.785609 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.804981 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.812149 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4b386d31-0a67-4d5a-910d-79a22339feaf-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.824116 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.845147 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.852480 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2ffe8fb4-5a71-4ef6-aef6-96603b8e7551-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-m7pjr\" (UID: \"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.864318 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.871307 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ffe8fb4-5a71-4ef6-aef6-96603b8e7551-config\") pod \"kube-apiserver-operator-766d6c64bb-m7pjr\" (UID: \"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.884710 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.948078 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.948209 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.948308 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.955821 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ba55033e-0bef-4588-bdac-820122c4fd4e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-lbflg\" (UID: \"ba55033e-0bef-4588-bdac-820122c4fd4e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.964814 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Oct 13 18:16:45 crc kubenswrapper[4974]: I1013 18:16:45.969136 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba55033e-0bef-4588-bdac-820122c4fd4e-config\") pod \"kube-controller-manager-operator-78b949d7b-lbflg\" (UID: \"ba55033e-0bef-4588-bdac-820122c4fd4e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.004776 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.024591 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.044146 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.064677 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.085127 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Oct 13 18:16:46 crc 
kubenswrapper[4974]: I1013 18:16:46.105328 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.172805 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vhrm\" (UniqueName: \"kubernetes.io/projected/b6539e39-b08d-4c27-a689-6401b299e123-kube-api-access-2vhrm\") pod \"oauth-openshift-558db77b4-gvsws\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.193252 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5tmd\" (UniqueName: \"kubernetes.io/projected/f8a0815a-ba28-4e47-ba48-2b6e9270a3d8-kube-api-access-n5tmd\") pod \"apiserver-76f77b778f-mlpn9\" (UID: \"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8\") " pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.209854 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rfpn\" (UniqueName: \"kubernetes.io/projected/bc5f4140-1f56-472e-95ed-cf3d4fb85f45-kube-api-access-9rfpn\") pod \"machine-api-operator-5694c8668f-knlnk\" (UID: \"bc5f4140-1f56-472e-95ed-cf3d4fb85f45\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.225905 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlxns\" (UniqueName: \"kubernetes.io/projected/ce0c606d-4062-4f6a-afec-752440b5580c-kube-api-access-wlxns\") pod \"console-f9d7485db-jbznq\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.249022 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.255116 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pdlf\" (UniqueName: \"kubernetes.io/projected/f4dff75c-23a2-40e2-9259-e056682367d7-kube-api-access-2pdlf\") pod \"controller-manager-879f6c89f-2qjrb\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.271281 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9lr4\" (UniqueName: \"kubernetes.io/projected/76842663-4197-4c71-8601-6a657814388b-kube-api-access-q9lr4\") pod \"route-controller-manager-6576b87f9c-mqg2t\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.285348 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.285377 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.292956 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j96z\" (UniqueName: \"kubernetes.io/projected/ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd-kube-api-access-6j96z\") pod \"cluster-samples-operator-665b6dd947-22fnv\" (UID: \"ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.298057 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.305336 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.325279 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.363887 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.366739 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dqqw\" (UniqueName: \"kubernetes.io/projected/5b94a42e-d2ac-46e7-a400-3702e7f5f261-kube-api-access-8dqqw\") pod \"machine-approver-56656f9798-x24fx\" (UID: \"5b94a42e-d2ac-46e7-a400-3702e7f5f261\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.385385 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.412687 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.426142 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.444496 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Oct 
13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.463202 4974 request.go:700] Waited for 1.005164264s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0 Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.464506 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.484805 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.485821 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.492409 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.501232 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.508014 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.517610 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.524868 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.533516 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.548325 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.550363 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv"] Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.564922 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.588356 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.601999 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gvsws"] Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.604156 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.629986 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-jbznq"] Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.631574 4974 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Oct 13 18:16:46 crc kubenswrapper[4974]: W1013 18:16:46.640735 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6539e39_b08d_4c27_a689_6401b299e123.slice/crio-9ec44c258057023ffe8640f3e778c4d40eadc0a609da5c3f921f605c67c78167 WatchSource:0}: Error finding container 9ec44c258057023ffe8640f3e778c4d40eadc0a609da5c3f921f605c67c78167: Status 404 returned error can't find the container with id 9ec44c258057023ffe8640f3e778c4d40eadc0a609da5c3f921f605c67c78167 Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.645220 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Oct 13 18:16:46 crc kubenswrapper[4974]: W1013 18:16:46.659364 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce0c606d_4062_4f6a_afec_752440b5580c.slice/crio-d61528ed40ea24156af1fd58000ad279a52a6e8611f08b4ed67b2792a3c1c103 WatchSource:0}: Error finding container d61528ed40ea24156af1fd58000ad279a52a6e8611f08b4ed67b2792a3c1c103: Status 404 returned error can't find the container with id d61528ed40ea24156af1fd58000ad279a52a6e8611f08b4ed67b2792a3c1c103 Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.664685 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.685127 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.703995 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.726687 4974 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.743571 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-mlpn9"] Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.745329 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.770604 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.784205 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.804766 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.824204 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Oct 13 18:16:46 crc kubenswrapper[4974]: W1013 18:16:46.834133 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8a0815a_ba28_4e47_ba48_2b6e9270a3d8.slice/crio-93e7091bd2f10c8666e8e672117ebef34439b3039c47cbd8381101fceebebe08 WatchSource:0}: Error finding container 93e7091bd2f10c8666e8e672117ebef34439b3039c47cbd8381101fceebebe08: Status 404 returned error can't find the container with id 93e7091bd2f10c8666e8e672117ebef34439b3039c47cbd8381101fceebebe08 Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.843897 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 
18:16:46.863394 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.892044 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.907221 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.924325 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.944464 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Oct 13 18:16:46 crc kubenswrapper[4974]: I1013 18:16:46.964532 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.027669 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.031543 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.031588 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.062793 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.072925 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 18:16:47 crc kubenswrapper[4974]: 
I1013 18:16:47.083071 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t"] Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.084842 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.086719 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2qjrb"] Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.089020 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-knlnk"] Oct 13 18:16:47 crc kubenswrapper[4974]: W1013 18:16:47.102622 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4dff75c_23a2_40e2_9259_e056682367d7.slice/crio-f9c0968f052effaeb17178c794569f72fb74c52a1c7c264c795f0081566d118a WatchSource:0}: Error finding container f9c0968f052effaeb17178c794569f72fb74c52a1c7c264c795f0081566d118a: Status 404 returned error can't find the container with id f9c0968f052effaeb17178c794569f72fb74c52a1c7c264c795f0081566d118a Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.103890 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.124431 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.146079 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.164820 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Oct 13 18:16:47 crc 
kubenswrapper[4974]: I1013 18:16:47.185192 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.204567 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.224179 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.244125 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.284113 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6tvd\" (UniqueName: \"kubernetes.io/projected/4b386d31-0a67-4d5a-910d-79a22339feaf-kube-api-access-m6tvd\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.303910 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b386d31-0a67-4d5a-910d-79a22339feaf-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-np2sc\" (UID: \"4b386d31-0a67-4d5a-910d-79a22339feaf\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.324244 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2ffe8fb4-5a71-4ef6-aef6-96603b8e7551-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-m7pjr\" (UID: \"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.343882 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfwp5\" (UniqueName: \"kubernetes.io/projected/820c88f3-f632-4375-b569-d796628c8f73-kube-api-access-rfwp5\") pod \"authentication-operator-69f744f599-khv5p\" (UID: \"820c88f3-f632-4375-b569-d796628c8f73\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.367808 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqdwk\" (UniqueName: \"kubernetes.io/projected/c139dd79-4db7-4c96-83af-378886acfcf8-kube-api-access-qqdwk\") pod \"openshift-apiserver-operator-796bbdcf4f-b27hb\" (UID: \"c139dd79-4db7-4c96-83af-378886acfcf8\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.379119 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ba55033e-0bef-4588-bdac-820122c4fd4e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-lbflg\" (UID: \"ba55033e-0bef-4588-bdac-820122c4fd4e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.380666 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.384478 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.386383 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.393746 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.405211 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.425088 4974 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.445078 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.465118 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.484100 4974 request.go:700] Waited for 1.967788395s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dnode-bootstrapper-token&limit=500&resourceVersion=0 Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.486341 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.505821 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.525200 4974 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.545552 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.568392 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.603528 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr"] Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.606633 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.607211 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" event={"ID":"76842663-4197-4c71-8601-6a657814388b","Type":"ContainerStarted","Data":"dbb989c50e965a36408c384d0e26c3a7a0f1425d5c5c34894fa15048ad73f0f1"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.607245 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" event={"ID":"76842663-4197-4c71-8601-6a657814388b","Type":"ContainerStarted","Data":"71760778a44f71049938f941216309eccb230aa231c48a0a8748e7d9595a6bf2"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.607409 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.610279 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" 
event={"ID":"ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd","Type":"ContainerStarted","Data":"806113dab9701d89a4c089c51da7d8dc9e239d759d7062fac310178bf7628989"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.610300 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" event={"ID":"ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd","Type":"ContainerStarted","Data":"39e4432dc5e94bb853d770a74a4d1c3228b2ebfff9e4d80ada70d47d772dc2d1"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.610312 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" event={"ID":"ad1f1b0b-26b8-4258-ae8a-defc9e04e0cd","Type":"ContainerStarted","Data":"13111b8d93c6b6c4d8d235df60ef06138b0936806ddc85bfa5008ad5bdc74187"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.612436 4974 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-mqg2t container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.612478 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" podUID="76842663-4197-4c71-8601-6a657814388b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.10:8443/healthz\": dial tcp 10.217.0.10:8443: connect: connection refused" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.619840 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" event={"ID":"f4dff75c-23a2-40e2-9259-e056682367d7","Type":"ContainerStarted","Data":"dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160"} Oct 
13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.620216 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" event={"ID":"f4dff75c-23a2-40e2-9259-e056682367d7","Type":"ContainerStarted","Data":"f9c0968f052effaeb17178c794569f72fb74c52a1c7c264c795f0081566d118a"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.621136 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.626228 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" event={"ID":"b6539e39-b08d-4c27-a689-6401b299e123","Type":"ContainerStarted","Data":"67e77fd01430d95f14c33f48443e369110278cc281f005e9882d6a78ff6a8215"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.626254 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" event={"ID":"b6539e39-b08d-4c27-a689-6401b299e123","Type":"ContainerStarted","Data":"9ec44c258057023ffe8640f3e778c4d40eadc0a609da5c3f921f605c67c78167"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.627343 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.627512 4974 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-2qjrb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.627555 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" podUID="f4dff75c-23a2-40e2-9259-e056682367d7" 
containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.628469 4974 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-gvsws container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.628488 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" podUID="b6539e39-b08d-4c27-a689-6401b299e123" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.643028 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc"] Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.643533 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.648783 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" event={"ID":"5b94a42e-d2ac-46e7-a400-3702e7f5f261","Type":"ContainerStarted","Data":"7ebb1254334447f93c045e63d3ca145f256e65ea3a1a5e33c06c8eba8b6fb820"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.648823 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" event={"ID":"5b94a42e-d2ac-46e7-a400-3702e7f5f261","Type":"ContainerStarted","Data":"8dfc2479b1ac872328e9ea2856425d408d917914323bccb6db5c5cbb06b3e4b6"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.648833 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" event={"ID":"5b94a42e-d2ac-46e7-a400-3702e7f5f261","Type":"ContainerStarted","Data":"a2c9ccff1880feaf1e9545eb13aa273e9ef7baa66df4072debf1e9529da29122"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.651301 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" event={"ID":"bc5f4140-1f56-472e-95ed-cf3d4fb85f45","Type":"ContainerStarted","Data":"c1007e2094defd39d982d359ae807c1a646c21bcc339c0455b85d4a0746ccc3d"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.651325 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" event={"ID":"bc5f4140-1f56-472e-95ed-cf3d4fb85f45","Type":"ContainerStarted","Data":"3bf0cb9ef355125be4736a4b2c5b88ba08e46e1c845baaed7228956c31077495"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.651335 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" 
event={"ID":"bc5f4140-1f56-472e-95ed-cf3d4fb85f45","Type":"ContainerStarted","Data":"f690c29a913e1ffdf62e2f31a20f3f911b9f641b2a0b130169fc9bd397cee2b4"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.654269 4974 generic.go:334] "Generic (PLEG): container finished" podID="f8a0815a-ba28-4e47-ba48-2b6e9270a3d8" containerID="f87dbed0bdf9d109f74f9cde7944debad1b260b5dde571309c5e1c42017a3fb4" exitCode=0 Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.655372 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" event={"ID":"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8","Type":"ContainerDied","Data":"f87dbed0bdf9d109f74f9cde7944debad1b260b5dde571309c5e1c42017a3fb4"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.655401 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" event={"ID":"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8","Type":"ContainerStarted","Data":"93e7091bd2f10c8666e8e672117ebef34439b3039c47cbd8381101fceebebe08"} Oct 13 18:16:47 crc kubenswrapper[4974]: W1013 18:16:47.656181 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b386d31_0a67_4d5a_910d_79a22339feaf.slice/crio-9d8d7529be56509f1c59e868b0c2c378eb5ca9f8eab65e6e217c14db115bf5c1 WatchSource:0}: Error finding container 9d8d7529be56509f1c59e868b0c2c378eb5ca9f8eab65e6e217c14db115bf5c1: Status 404 returned error can't find the container with id 9d8d7529be56509f1c59e868b0c2c378eb5ca9f8eab65e6e217c14db115bf5c1 Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.661925 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jbznq" event={"ID":"ce0c606d-4062-4f6a-afec-752440b5580c","Type":"ContainerStarted","Data":"f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.661968 4974 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jbznq" event={"ID":"ce0c606d-4062-4f6a-afec-752440b5580c","Type":"ContainerStarted","Data":"d61528ed40ea24156af1fd58000ad279a52a6e8611f08b4ed67b2792a3c1c103"} Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679519 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-tls\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679559 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c739a775-23db-45ab-b1d0-4860b2353cab-proxy-tls\") pod \"machine-config-controller-84d6567774-9j6ns\" (UID: \"c739a775-23db-45ab-b1d0-4860b2353cab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679596 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679625 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c22cg\" (UniqueName: \"kubernetes.io/projected/57abf354-bc90-422f-a35d-c0185fa85ca1-kube-api-access-c22cg\") pod \"dns-operator-744455d44c-dsjgz\" (UID: \"57abf354-bc90-422f-a35d-c0185fa85ca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" Oct 
13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679641 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6485l\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-kube-api-access-6485l\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679671 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9ce9e429-c79c-41d2-82d6-f3a432fd3173-audit-policies\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679685 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9ce9e429-c79c-41d2-82d6-f3a432fd3173-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679739 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-trusted-ca\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679755 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86c5n\" (UniqueName: \"kubernetes.io/projected/e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74-kube-api-access-86c5n\") pod 
\"openshift-config-operator-7777fb866f-nrzsg\" (UID: \"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679776 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e713c27c-e93f-44ce-8e9a-d908099b699c-stats-auth\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679802 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c739a775-23db-45ab-b1d0-4860b2353cab-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9j6ns\" (UID: \"c739a775-23db-45ab-b1d0-4860b2353cab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679817 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2flzl\" (UniqueName: \"kubernetes.io/projected/9ce9e429-c79c-41d2-82d6-f3a432fd3173-kube-api-access-2flzl\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679842 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73a7134a-5187-4b6e-aa08-40146a99e93e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bpshp\" (UID: \"73a7134a-5187-4b6e-aa08-40146a99e93e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 13 18:16:47 crc 
kubenswrapper[4974]: I1013 18:16:47.679856 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xthtl\" (UniqueName: \"kubernetes.io/projected/c739a775-23db-45ab-b1d0-4860b2353cab-kube-api-access-xthtl\") pod \"machine-config-controller-84d6567774-9j6ns\" (UID: \"c739a775-23db-45ab-b1d0-4860b2353cab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679871 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e713c27c-e93f-44ce-8e9a-d908099b699c-service-ca-bundle\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679889 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74-serving-cert\") pod \"openshift-config-operator-7777fb866f-nrzsg\" (UID: \"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679904 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a933da9-d7fc-40a5-ae28-4fb8e641722d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lbfzq\" (UID: \"6a933da9-d7fc-40a5-ae28-4fb8e641722d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679927 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/9ce9e429-c79c-41d2-82d6-f3a432fd3173-audit-dir\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679952 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73a7134a-5187-4b6e-aa08-40146a99e93e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bpshp\" (UID: \"73a7134a-5187-4b6e-aa08-40146a99e93e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679966 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40c4254f-5c3d-4655-82f9-49fd9510339a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.679988 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40c4254f-5c3d-4655-82f9-49fd9510339a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680003 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkhg5\" (UniqueName: \"kubernetes.io/projected/e713c27c-e93f-44ce-8e9a-d908099b699c-kube-api-access-xkhg5\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " 
pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680017 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/88ad4d09-fc16-4a33-90de-64fe96f64c12-etcd-service-ca\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680032 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a933da9-d7fc-40a5-ae28-4fb8e641722d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lbfzq\" (UID: \"6a933da9-d7fc-40a5-ae28-4fb8e641722d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680048 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l29rd\" (UniqueName: \"kubernetes.io/projected/9afdbe18-a37f-4678-9552-026d1d55946d-kube-api-access-l29rd\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680064 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9ce9e429-c79c-41d2-82d6-f3a432fd3173-encryption-config\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680107 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9afdbe18-a37f-4678-9552-026d1d55946d-serving-cert\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680122 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57abf354-bc90-422f-a35d-c0185fa85ca1-metrics-tls\") pod \"dns-operator-744455d44c-dsjgz\" (UID: \"57abf354-bc90-422f-a35d-c0185fa85ca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680137 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7sxk\" (UniqueName: \"kubernetes.io/projected/34b64103-310c-4c28-80ca-decee688ee64-kube-api-access-c7sxk\") pod \"downloads-7954f5f757-rr898\" (UID: \"34b64103-310c-4c28-80ca-decee688ee64\") " pod="openshift-console/downloads-7954f5f757-rr898" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680163 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9afdbe18-a37f-4678-9552-026d1d55946d-trusted-ca\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680177 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a933da9-d7fc-40a5-ae28-4fb8e641722d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lbfzq\" (UID: \"6a933da9-d7fc-40a5-ae28-4fb8e641722d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 
18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680208 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-certificates\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680224 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ll5n\" (UniqueName: \"kubernetes.io/projected/73a7134a-5187-4b6e-aa08-40146a99e93e-kube-api-access-7ll5n\") pod \"openshift-controller-manager-operator-756b6f6bc6-bpshp\" (UID: \"73a7134a-5187-4b6e-aa08-40146a99e93e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680239 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9afdbe18-a37f-4678-9552-026d1d55946d-config\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680252 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nrzsg\" (UID: \"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680279 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e713c27c-e93f-44ce-8e9a-d908099b699c-metrics-certs\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680294 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ce9e429-c79c-41d2-82d6-f3a432fd3173-serving-cert\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680310 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ce9e429-c79c-41d2-82d6-f3a432fd3173-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680340 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fchnm\" (UniqueName: \"kubernetes.io/projected/88ad4d09-fc16-4a33-90de-64fe96f64c12-kube-api-access-fchnm\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680375 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9ce9e429-c79c-41d2-82d6-f3a432fd3173-etcd-client\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 
18:16:47.680394 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88ad4d09-fc16-4a33-90de-64fe96f64c12-etcd-client\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680414 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-bound-sa-token\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680435 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e713c27c-e93f-44ce-8e9a-d908099b699c-default-certificate\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680457 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88ad4d09-fc16-4a33-90de-64fe96f64c12-serving-cert\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680475 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88ad4d09-fc16-4a33-90de-64fe96f64c12-config\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.680497 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/88ad4d09-fc16-4a33-90de-64fe96f64c12-etcd-ca\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: E1013 18:16:47.681611 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:48.181600009 +0000 UTC m=+143.085966089 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.690613 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg"] Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.781938 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.782122 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjfjm\" (UniqueName: \"kubernetes.io/projected/78a93fc9-5305-44ea-a573-3e54bd52f22d-kube-api-access-wjfjm\") pod \"control-plane-machine-set-operator-78cbb6b69f-h6bwk\" (UID: \"78a93fc9-5305-44ea-a573-3e54bd52f22d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" Oct 13 18:16:47 crc kubenswrapper[4974]: E1013 18:16:47.782176 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:48.282129669 +0000 UTC m=+143.186495799 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785240 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dv84\" (UniqueName: \"kubernetes.io/projected/11322ed1-eb0b-4d64-890d-2f4948c102a7-kube-api-access-8dv84\") pod \"service-ca-operator-777779d784-lvmtc\" (UID: \"11322ed1-eb0b-4d64-890d-2f4948c102a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785302 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e713c27c-e93f-44ce-8e9a-d908099b699c-metrics-certs\") pod \"router-default-5444994796-9p2fb\" (UID: 
\"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785322 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ce9e429-c79c-41d2-82d6-f3a432fd3173-serving-cert\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785345 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ce9e429-c79c-41d2-82d6-f3a432fd3173-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785409 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3593c403-0e2d-4cee-a2fa-8df43371df5a-srv-cert\") pod \"catalog-operator-68c6474976-bhw5m\" (UID: \"3593c403-0e2d-4cee-a2fa-8df43371df5a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785452 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fchnm\" (UniqueName: \"kubernetes.io/projected/88ad4d09-fc16-4a33-90de-64fe96f64c12-kube-api-access-fchnm\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785470 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/53c8c837-3981-467c-92ae-2f0fd6cfbb48-config-volume\") pod \"dns-default-fgm8n\" (UID: \"53c8c837-3981-467c-92ae-2f0fd6cfbb48\") " pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785520 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88ad4d09-fc16-4a33-90de-64fe96f64c12-etcd-client\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785539 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-mountpoint-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785568 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b930522f-1b72-4d47-a50e-883ed2c2facb-trusted-ca\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785603 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-bound-sa-token\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.785882 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-m6wcx\" (UniqueName: \"kubernetes.io/projected/18bd98c7-66c4-47fc-b2e4-b53134a191b8-kube-api-access-m6wcx\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mxlt\" (UID: \"18bd98c7-66c4-47fc-b2e4-b53134a191b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786020 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88ad4d09-fc16-4a33-90de-64fe96f64c12-serving-cert\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786055 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh7fr\" (UniqueName: \"kubernetes.io/projected/b930522f-1b72-4d47-a50e-883ed2c2facb-kube-api-access-hh7fr\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786186 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-tls\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786231 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c739a775-23db-45ab-b1d0-4860b2353cab-proxy-tls\") pod \"machine-config-controller-84d6567774-9j6ns\" (UID: \"c739a775-23db-45ab-b1d0-4860b2353cab\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786279 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53800734-bd36-4fd6-9ed9-f69208f973b2-cert\") pod \"ingress-canary-xfvq4\" (UID: \"53800734-bd36-4fd6-9ed9-f69208f973b2\") " pod="openshift-ingress-canary/ingress-canary-xfvq4" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786336 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2qqc\" (UniqueName: \"kubernetes.io/projected/fa74b416-3f9b-45de-a657-79a31f755b9c-kube-api-access-g2qqc\") pod \"marketplace-operator-79b997595-wph2s\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786402 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6485l\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-kube-api-access-6485l\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786426 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-socket-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786462 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/b317e9fb-3b83-47f2-971b-7e509a59782f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786489 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-csi-data-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786550 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86c5n\" (UniqueName: \"kubernetes.io/projected/e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74-kube-api-access-86c5n\") pod \"openshift-config-operator-7777fb866f-nrzsg\" (UID: \"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786716 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2flzl\" (UniqueName: \"kubernetes.io/projected/9ce9e429-c79c-41d2-82d6-f3a432fd3173-kube-api-access-2flzl\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786776 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73a7134a-5187-4b6e-aa08-40146a99e93e-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-bpshp\" (UID: \"73a7134a-5187-4b6e-aa08-40146a99e93e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 
13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786799 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xthtl\" (UniqueName: \"kubernetes.io/projected/c739a775-23db-45ab-b1d0-4860b2353cab-kube-api-access-xthtl\") pod \"machine-config-controller-84d6567774-9j6ns\" (UID: \"c739a775-23db-45ab-b1d0-4860b2353cab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786855 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74-serving-cert\") pod \"openshift-config-operator-7777fb866f-nrzsg\" (UID: \"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786876 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18bd98c7-66c4-47fc-b2e4-b53134a191b8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mxlt\" (UID: \"18bd98c7-66c4-47fc-b2e4-b53134a191b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786897 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/73685109-d1de-4e40-8bfe-3b817aaadabd-tmpfs\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786941 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/3593c403-0e2d-4cee-a2fa-8df43371df5a-profile-collector-cert\") pod \"catalog-operator-68c6474976-bhw5m\" (UID: \"3593c403-0e2d-4cee-a2fa-8df43371df5a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.786965 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73a7134a-5187-4b6e-aa08-40146a99e93e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bpshp\" (UID: \"73a7134a-5187-4b6e-aa08-40146a99e93e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787019 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40c4254f-5c3d-4655-82f9-49fd9510339a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787074 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/88ad4d09-fc16-4a33-90de-64fe96f64c12-etcd-service-ca\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787098 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-registration-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 
18:16:47.787127 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9ce9e429-c79c-41d2-82d6-f3a432fd3173-encryption-config\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787319 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wtbj\" (UniqueName: \"kubernetes.io/projected/83624c06-f2ff-4706-b4cd-1d41edb91898-kube-api-access-5wtbj\") pod \"migrator-59844c95c7-kkk6m\" (UID: \"83624c06-f2ff-4706-b4cd-1d41edb91898\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787343 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b317e9fb-3b83-47f2-971b-7e509a59782f-proxy-tls\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787385 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b930522f-1b72-4d47-a50e-883ed2c2facb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787430 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7sxk\" (UniqueName: \"kubernetes.io/projected/34b64103-310c-4c28-80ca-decee688ee64-kube-api-access-c7sxk\") pod \"downloads-7954f5f757-rr898\" (UID: 
\"34b64103-310c-4c28-80ca-decee688ee64\") " pod="openshift-console/downloads-7954f5f757-rr898" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787623 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9kth\" (UniqueName: \"kubernetes.io/projected/b317e9fb-3b83-47f2-971b-7e509a59782f-kube-api-access-x9kth\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787712 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9afdbe18-a37f-4678-9552-026d1d55946d-trusted-ca\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787731 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b930522f-1b72-4d47-a50e-883ed2c2facb-metrics-tls\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787810 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wph2s\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787851 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7429c3e4-2ad8-4373-807a-b69a11868c49-secret-volume\") pod \"collect-profiles-29339655-svflh\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787914 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-certificates\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787939 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ll5n\" (UniqueName: \"kubernetes.io/projected/73a7134a-5187-4b6e-aa08-40146a99e93e-kube-api-access-7ll5n\") pod \"openshift-controller-manager-operator-756b6f6bc6-bpshp\" (UID: \"73a7134a-5187-4b6e-aa08-40146a99e93e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.787969 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9afdbe18-a37f-4678-9552-026d1d55946d-config\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.788464 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nrzsg\" (UID: 
\"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.788489 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73685109-d1de-4e40-8bfe-3b817aaadabd-apiservice-cert\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.788568 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-plugins-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.788635 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pb54\" (UniqueName: \"kubernetes.io/projected/3593c403-0e2d-4cee-a2fa-8df43371df5a-kube-api-access-4pb54\") pod \"catalog-operator-68c6474976-bhw5m\" (UID: \"3593c403-0e2d-4cee-a2fa-8df43371df5a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.788685 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b317e9fb-3b83-47f2-971b-7e509a59782f-images\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.788714 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73685109-d1de-4e40-8bfe-3b817aaadabd-webhook-cert\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.788732 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vkml\" (UniqueName: \"kubernetes.io/projected/73685109-d1de-4e40-8bfe-3b817aaadabd-kube-api-access-2vkml\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.788775 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53c8c837-3981-467c-92ae-2f0fd6cfbb48-metrics-tls\") pod \"dns-default-fgm8n\" (UID: \"53c8c837-3981-467c-92ae-2f0fd6cfbb48\") " pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.788887 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b9012940-31e8-4aeb-98a5-a08550e781b1-certs\") pod \"machine-config-server-7486c\" (UID: \"b9012940-31e8-4aeb-98a5-a08550e781b1\") " pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.789026 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zbcz\" (UniqueName: \"kubernetes.io/projected/53c8c837-3981-467c-92ae-2f0fd6cfbb48-kube-api-access-4zbcz\") pod \"dns-default-fgm8n\" (UID: \"53c8c837-3981-467c-92ae-2f0fd6cfbb48\") " pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:47 
crc kubenswrapper[4974]: I1013 18:16:47.789083 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9ce9e429-c79c-41d2-82d6-f3a432fd3173-etcd-client\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.789134 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11322ed1-eb0b-4d64-890d-2f4948c102a7-config\") pod \"service-ca-operator-777779d784-lvmtc\" (UID: \"11322ed1-eb0b-4d64-890d-2f4948c102a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.789181 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e713c27c-e93f-44ce-8e9a-d908099b699c-default-certificate\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794415 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88ad4d09-fc16-4a33-90de-64fe96f64c12-config\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794449 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/88ad4d09-fc16-4a33-90de-64fe96f64c12-etcd-ca\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 
13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794503 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wph2s\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794576 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794616 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b9012940-31e8-4aeb-98a5-a08550e781b1-node-bootstrap-token\") pod \"machine-config-server-7486c\" (UID: \"b9012940-31e8-4aeb-98a5-a08550e781b1\") " pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794679 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c22cg\" (UniqueName: \"kubernetes.io/projected/57abf354-bc90-422f-a35d-c0185fa85ca1-kube-api-access-c22cg\") pod \"dns-operator-744455d44c-dsjgz\" (UID: \"57abf354-bc90-422f-a35d-c0185fa85ca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794703 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/487f3a34-1886-4b8f-a9c3-b1d4f7702de2-signing-cabundle\") pod \"service-ca-9c57cc56f-rdq7v\" (UID: \"487f3a34-1886-4b8f-a9c3-b1d4f7702de2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794760 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9ce9e429-c79c-41d2-82d6-f3a432fd3173-audit-policies\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794781 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/487f3a34-1886-4b8f-a9c3-b1d4f7702de2-signing-key\") pod \"service-ca-9c57cc56f-rdq7v\" (UID: \"487f3a34-1886-4b8f-a9c3-b1d4f7702de2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794821 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9ce9e429-c79c-41d2-82d6-f3a432fd3173-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794843 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11322ed1-eb0b-4d64-890d-2f4948c102a7-serving-cert\") pod \"service-ca-operator-777779d784-lvmtc\" (UID: \"11322ed1-eb0b-4d64-890d-2f4948c102a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794865 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njwpj\" (UniqueName: \"kubernetes.io/projected/b9012940-31e8-4aeb-98a5-a08550e781b1-kube-api-access-njwpj\") pod \"machine-config-server-7486c\" (UID: \"b9012940-31e8-4aeb-98a5-a08550e781b1\") " pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794914 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nt7q\" (UniqueName: \"kubernetes.io/projected/77988f90-8a72-4a03-9394-9f1971c21484-kube-api-access-2nt7q\") pod \"multus-admission-controller-857f4d67dd-4r245\" (UID: \"77988f90-8a72-4a03-9394-9f1971c21484\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794977 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-trusted-ca\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.794997 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chzdp\" (UniqueName: \"kubernetes.io/projected/78b85293-78d3-47e8-8546-7d625828b45a-kube-api-access-chzdp\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.795059 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e713c27c-e93f-44ce-8e9a-d908099b699c-stats-auth\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " 
pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.795127 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c739a775-23db-45ab-b1d0-4860b2353cab-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9j6ns\" (UID: \"c739a775-23db-45ab-b1d0-4860b2353cab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.797478 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e713c27c-e93f-44ce-8e9a-d908099b699c-service-ca-bundle\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.798072 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a933da9-d7fc-40a5-ae28-4fb8e641722d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lbfzq\" (UID: \"6a933da9-d7fc-40a5-ae28-4fb8e641722d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.798923 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/77988f90-8a72-4a03-9394-9f1971c21484-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4r245\" (UID: \"77988f90-8a72-4a03-9394-9f1971c21484\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.798953 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/7429c3e4-2ad8-4373-807a-b69a11868c49-config-volume\") pod \"collect-profiles-29339655-svflh\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.798980 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5tgh\" (UniqueName: \"kubernetes.io/projected/53800734-bd36-4fd6-9ed9-f69208f973b2-kube-api-access-d5tgh\") pod \"ingress-canary-xfvq4\" (UID: \"53800734-bd36-4fd6-9ed9-f69208f973b2\") " pod="openshift-ingress-canary/ingress-canary-xfvq4" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799116 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9ce9e429-c79c-41d2-82d6-f3a432fd3173-audit-dir\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799161 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2zqr\" (UniqueName: \"kubernetes.io/projected/ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1-kube-api-access-p2zqr\") pod \"package-server-manager-789f6589d5-jtjwq\" (UID: \"ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799229 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40c4254f-5c3d-4655-82f9-49fd9510339a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc 
kubenswrapper[4974]: I1013 18:16:47.799258 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkhg5\" (UniqueName: \"kubernetes.io/projected/e713c27c-e93f-44ce-8e9a-d908099b699c-kube-api-access-xkhg5\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799285 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a933da9-d7fc-40a5-ae28-4fb8e641722d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lbfzq\" (UID: \"6a933da9-d7fc-40a5-ae28-4fb8e641722d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799345 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l29rd\" (UniqueName: \"kubernetes.io/projected/9afdbe18-a37f-4678-9552-026d1d55946d-kube-api-access-l29rd\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799371 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzdmp\" (UniqueName: \"kubernetes.io/projected/487f3a34-1886-4b8f-a9c3-b1d4f7702de2-kube-api-access-zzdmp\") pod \"service-ca-9c57cc56f-rdq7v\" (UID: \"487f3a34-1886-4b8f-a9c3-b1d4f7702de2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799404 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wckjr\" (UniqueName: \"kubernetes.io/projected/9041feac-829c-4d87-ade9-e09fd97b46ba-kube-api-access-wckjr\") pod 
\"olm-operator-6b444d44fb-q7r7s\" (UID: \"9041feac-829c-4d87-ade9-e09fd97b46ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799466 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9041feac-829c-4d87-ade9-e09fd97b46ba-srv-cert\") pod \"olm-operator-6b444d44fb-q7r7s\" (UID: \"9041feac-829c-4d87-ade9-e09fd97b46ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799517 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9afdbe18-a37f-4678-9552-026d1d55946d-serving-cert\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799541 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57abf354-bc90-422f-a35d-c0185fa85ca1-metrics-tls\") pod \"dns-operator-744455d44c-dsjgz\" (UID: \"57abf354-bc90-422f-a35d-c0185fa85ca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799565 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jtjwq\" (UID: \"ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799636 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a933da9-d7fc-40a5-ae28-4fb8e641722d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lbfzq\" (UID: \"6a933da9-d7fc-40a5-ae28-4fb8e641722d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799699 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knkc8\" (UniqueName: \"kubernetes.io/projected/7429c3e4-2ad8-4373-807a-b69a11868c49-kube-api-access-knkc8\") pod \"collect-profiles-29339655-svflh\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799743 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bd98c7-66c4-47fc-b2e4-b53134a191b8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mxlt\" (UID: \"18bd98c7-66c4-47fc-b2e4-b53134a191b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799772 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9041feac-829c-4d87-ade9-e09fd97b46ba-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q7r7s\" (UID: \"9041feac-829c-4d87-ade9-e09fd97b46ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799778 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9afdbe18-a37f-4678-9552-026d1d55946d-trusted-ca\") pod 
\"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.799813 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/78a93fc9-5305-44ea-a573-3e54bd52f22d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h6bwk\" (UID: \"78a93fc9-5305-44ea-a573-3e54bd52f22d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.800114 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e713c27c-e93f-44ce-8e9a-d908099b699c-metrics-certs\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.800549 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ce9e429-c79c-41d2-82d6-f3a432fd3173-serving-cert\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.801232 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/88ad4d09-fc16-4a33-90de-64fe96f64c12-serving-cert\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.802241 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-certificates\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.802620 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/88ad4d09-fc16-4a33-90de-64fe96f64c12-etcd-client\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.803578 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9ce9e429-c79c-41d2-82d6-f3a432fd3173-audit-dir\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.805387 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-trusted-ca\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.805857 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-tls\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.807297 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/e713c27c-e93f-44ce-8e9a-d908099b699c-default-certificate\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.808484 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9afdbe18-a37f-4678-9552-026d1d55946d-config\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.808586 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a933da9-d7fc-40a5-ae28-4fb8e641722d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lbfzq\" (UID: \"6a933da9-d7fc-40a5-ae28-4fb8e641722d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.808865 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74-available-featuregates\") pod \"openshift-config-operator-7777fb866f-nrzsg\" (UID: \"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.810566 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c739a775-23db-45ab-b1d0-4860b2353cab-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-9j6ns\" (UID: \"c739a775-23db-45ab-b1d0-4860b2353cab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 
18:16:47.811944 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/9ce9e429-c79c-41d2-82d6-f3a432fd3173-encryption-config\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.812365 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/73a7134a-5187-4b6e-aa08-40146a99e93e-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-bpshp\" (UID: \"73a7134a-5187-4b6e-aa08-40146a99e93e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.812521 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40c4254f-5c3d-4655-82f9-49fd9510339a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.814328 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/88ad4d09-fc16-4a33-90de-64fe96f64c12-config\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.814495 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/88ad4d09-fc16-4a33-90de-64fe96f64c12-etcd-service-ca\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 
crc kubenswrapper[4974]: E1013 18:16:47.815239 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:48.315218117 +0000 UTC m=+143.219584207 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.815294 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9ce9e429-c79c-41d2-82d6-f3a432fd3173-etcd-client\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.815601 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c739a775-23db-45ab-b1d0-4860b2353cab-proxy-tls\") pod \"machine-config-controller-84d6567774-9j6ns\" (UID: \"c739a775-23db-45ab-b1d0-4860b2353cab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.815811 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40c4254f-5c3d-4655-82f9-49fd9510339a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.817425 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e713c27c-e93f-44ce-8e9a-d908099b699c-stats-auth\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.818914 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9ce9e429-c79c-41d2-82d6-f3a432fd3173-audit-policies\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.819180 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e713c27c-e93f-44ce-8e9a-d908099b699c-service-ca-bundle\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.819545 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/88ad4d09-fc16-4a33-90de-64fe96f64c12-etcd-ca\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.819975 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/9ce9e429-c79c-41d2-82d6-f3a432fd3173-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.821242 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ll5n\" (UniqueName: \"kubernetes.io/projected/73a7134a-5187-4b6e-aa08-40146a99e93e-kube-api-access-7ll5n\") pod \"openshift-controller-manager-operator-756b6f6bc6-bpshp\" (UID: \"73a7134a-5187-4b6e-aa08-40146a99e93e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.823034 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ce9e429-c79c-41d2-82d6-f3a432fd3173-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.823606 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74-serving-cert\") pod \"openshift-config-operator-7777fb866f-nrzsg\" (UID: \"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.824231 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/57abf354-bc90-422f-a35d-c0185fa85ca1-metrics-tls\") pod \"dns-operator-744455d44c-dsjgz\" (UID: \"57abf354-bc90-422f-a35d-c0185fa85ca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.824625 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/73a7134a-5187-4b6e-aa08-40146a99e93e-serving-cert\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-bpshp\" (UID: \"73a7134a-5187-4b6e-aa08-40146a99e93e\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.825930 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a933da9-d7fc-40a5-ae28-4fb8e641722d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lbfzq\" (UID: \"6a933da9-d7fc-40a5-ae28-4fb8e641722d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.836046 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9afdbe18-a37f-4678-9552-026d1d55946d-serving-cert\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.838493 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-bound-sa-token\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.867466 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2flzl\" (UniqueName: \"kubernetes.io/projected/9ce9e429-c79c-41d2-82d6-f3a432fd3173-kube-api-access-2flzl\") pod \"apiserver-7bbb656c7d-sx8k7\" (UID: \"9ce9e429-c79c-41d2-82d6-f3a432fd3173\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.892556 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-authentication-operator/authentication-operator-69f744f599-khv5p"] Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.896282 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86c5n\" (UniqueName: \"kubernetes.io/projected/e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74-kube-api-access-86c5n\") pod \"openshift-config-operator-7777fb866f-nrzsg\" (UID: \"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902093 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902298 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knkc8\" (UniqueName: \"kubernetes.io/projected/7429c3e4-2ad8-4373-807a-b69a11868c49-kube-api-access-knkc8\") pod \"collect-profiles-29339655-svflh\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902334 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bd98c7-66c4-47fc-b2e4-b53134a191b8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mxlt\" (UID: \"18bd98c7-66c4-47fc-b2e4-b53134a191b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902361 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/9041feac-829c-4d87-ade9-e09fd97b46ba-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q7r7s\" (UID: \"9041feac-829c-4d87-ade9-e09fd97b46ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902385 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/78a93fc9-5305-44ea-a573-3e54bd52f22d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h6bwk\" (UID: \"78a93fc9-5305-44ea-a573-3e54bd52f22d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902412 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjfjm\" (UniqueName: \"kubernetes.io/projected/78a93fc9-5305-44ea-a573-3e54bd52f22d-kube-api-access-wjfjm\") pod \"control-plane-machine-set-operator-78cbb6b69f-h6bwk\" (UID: \"78a93fc9-5305-44ea-a573-3e54bd52f22d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902436 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dv84\" (UniqueName: \"kubernetes.io/projected/11322ed1-eb0b-4d64-890d-2f4948c102a7-kube-api-access-8dv84\") pod \"service-ca-operator-777779d784-lvmtc\" (UID: \"11322ed1-eb0b-4d64-890d-2f4948c102a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902461 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3593c403-0e2d-4cee-a2fa-8df43371df5a-srv-cert\") pod \"catalog-operator-68c6474976-bhw5m\" (UID: \"3593c403-0e2d-4cee-a2fa-8df43371df5a\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902489 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53c8c837-3981-467c-92ae-2f0fd6cfbb48-config-volume\") pod \"dns-default-fgm8n\" (UID: \"53c8c837-3981-467c-92ae-2f0fd6cfbb48\") " pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902518 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-mountpoint-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902540 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b930522f-1b72-4d47-a50e-883ed2c2facb-trusted-ca\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902562 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6wcx\" (UniqueName: \"kubernetes.io/projected/18bd98c7-66c4-47fc-b2e4-b53134a191b8-kube-api-access-m6wcx\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mxlt\" (UID: \"18bd98c7-66c4-47fc-b2e4-b53134a191b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902587 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh7fr\" (UniqueName: 
\"kubernetes.io/projected/b930522f-1b72-4d47-a50e-883ed2c2facb-kube-api-access-hh7fr\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902611 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53800734-bd36-4fd6-9ed9-f69208f973b2-cert\") pod \"ingress-canary-xfvq4\" (UID: \"53800734-bd36-4fd6-9ed9-f69208f973b2\") " pod="openshift-ingress-canary/ingress-canary-xfvq4" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902887 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2qqc\" (UniqueName: \"kubernetes.io/projected/fa74b416-3f9b-45de-a657-79a31f755b9c-kube-api-access-g2qqc\") pod \"marketplace-operator-79b997595-wph2s\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902931 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-socket-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902954 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b317e9fb-3b83-47f2-971b-7e509a59782f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.902975 4974 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-csi-data-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.903013 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18bd98c7-66c4-47fc-b2e4-b53134a191b8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mxlt\" (UID: \"18bd98c7-66c4-47fc-b2e4-b53134a191b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.903040 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/73685109-d1de-4e40-8bfe-3b817aaadabd-tmpfs\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.903097 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3593c403-0e2d-4cee-a2fa-8df43371df5a-profile-collector-cert\") pod \"catalog-operator-68c6474976-bhw5m\" (UID: \"3593c403-0e2d-4cee-a2fa-8df43371df5a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.903129 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-registration-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc 
kubenswrapper[4974]: I1013 18:16:47.903152 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wtbj\" (UniqueName: \"kubernetes.io/projected/83624c06-f2ff-4706-b4cd-1d41edb91898-kube-api-access-5wtbj\") pod \"migrator-59844c95c7-kkk6m\" (UID: \"83624c06-f2ff-4706-b4cd-1d41edb91898\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.903174 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b317e9fb-3b83-47f2-971b-7e509a59782f-proxy-tls\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.903379 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b930522f-1b72-4d47-a50e-883ed2c2facb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.904226 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b930522f-1b72-4d47-a50e-883ed2c2facb-trusted-ca\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.904315 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9kth\" (UniqueName: \"kubernetes.io/projected/b317e9fb-3b83-47f2-971b-7e509a59782f-kube-api-access-x9kth\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: 
\"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.904367 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b930522f-1b72-4d47-a50e-883ed2c2facb-metrics-tls\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.904391 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wph2s\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.904408 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7429c3e4-2ad8-4373-807a-b69a11868c49-secret-volume\") pod \"collect-profiles-29339655-svflh\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.904425 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73685109-d1de-4e40-8bfe-3b817aaadabd-apiservice-cert\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.904551 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-plugins-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.904569 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pb54\" (UniqueName: \"kubernetes.io/projected/3593c403-0e2d-4cee-a2fa-8df43371df5a-kube-api-access-4pb54\") pod \"catalog-operator-68c6474976-bhw5m\" (UID: \"3593c403-0e2d-4cee-a2fa-8df43371df5a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.904584 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b317e9fb-3b83-47f2-971b-7e509a59782f-images\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.904608 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73685109-d1de-4e40-8bfe-3b817aaadabd-webhook-cert\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905321 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vkml\" (UniqueName: \"kubernetes.io/projected/73685109-d1de-4e40-8bfe-3b817aaadabd-kube-api-access-2vkml\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905351 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53c8c837-3981-467c-92ae-2f0fd6cfbb48-metrics-tls\") pod \"dns-default-fgm8n\" (UID: \"53c8c837-3981-467c-92ae-2f0fd6cfbb48\") " pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905370 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b9012940-31e8-4aeb-98a5-a08550e781b1-certs\") pod \"machine-config-server-7486c\" (UID: \"b9012940-31e8-4aeb-98a5-a08550e781b1\") " pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905385 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zbcz\" (UniqueName: \"kubernetes.io/projected/53c8c837-3981-467c-92ae-2f0fd6cfbb48-kube-api-access-4zbcz\") pod \"dns-default-fgm8n\" (UID: \"53c8c837-3981-467c-92ae-2f0fd6cfbb48\") " pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905402 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11322ed1-eb0b-4d64-890d-2f4948c102a7-config\") pod \"service-ca-operator-777779d784-lvmtc\" (UID: \"11322ed1-eb0b-4d64-890d-2f4948c102a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905420 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wph2s\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905447 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b9012940-31e8-4aeb-98a5-a08550e781b1-node-bootstrap-token\") pod \"machine-config-server-7486c\" (UID: \"b9012940-31e8-4aeb-98a5-a08550e781b1\") " pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905469 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/487f3a34-1886-4b8f-a9c3-b1d4f7702de2-signing-cabundle\") pod \"service-ca-9c57cc56f-rdq7v\" (UID: \"487f3a34-1886-4b8f-a9c3-b1d4f7702de2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905488 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/487f3a34-1886-4b8f-a9c3-b1d4f7702de2-signing-key\") pod \"service-ca-9c57cc56f-rdq7v\" (UID: \"487f3a34-1886-4b8f-a9c3-b1d4f7702de2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905505 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11322ed1-eb0b-4d64-890d-2f4948c102a7-serving-cert\") pod \"service-ca-operator-777779d784-lvmtc\" (UID: \"11322ed1-eb0b-4d64-890d-2f4948c102a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905522 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njwpj\" (UniqueName: \"kubernetes.io/projected/b9012940-31e8-4aeb-98a5-a08550e781b1-kube-api-access-njwpj\") pod \"machine-config-server-7486c\" (UID: \"b9012940-31e8-4aeb-98a5-a08550e781b1\") " pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:47 
crc kubenswrapper[4974]: I1013 18:16:47.905540 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chzdp\" (UniqueName: \"kubernetes.io/projected/78b85293-78d3-47e8-8546-7d625828b45a-kube-api-access-chzdp\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905556 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nt7q\" (UniqueName: \"kubernetes.io/projected/77988f90-8a72-4a03-9394-9f1971c21484-kube-api-access-2nt7q\") pod \"multus-admission-controller-857f4d67dd-4r245\" (UID: \"77988f90-8a72-4a03-9394-9f1971c21484\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905588 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/77988f90-8a72-4a03-9394-9f1971c21484-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4r245\" (UID: \"77988f90-8a72-4a03-9394-9f1971c21484\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905603 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7429c3e4-2ad8-4373-807a-b69a11868c49-config-volume\") pod \"collect-profiles-29339655-svflh\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905619 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5tgh\" (UniqueName: \"kubernetes.io/projected/53800734-bd36-4fd6-9ed9-f69208f973b2-kube-api-access-d5tgh\") pod \"ingress-canary-xfvq4\" (UID: 
\"53800734-bd36-4fd6-9ed9-f69208f973b2\") " pod="openshift-ingress-canary/ingress-canary-xfvq4" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.905645 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2zqr\" (UniqueName: \"kubernetes.io/projected/ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1-kube-api-access-p2zqr\") pod \"package-server-manager-789f6589d5-jtjwq\" (UID: \"ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.906192 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzdmp\" (UniqueName: \"kubernetes.io/projected/487f3a34-1886-4b8f-a9c3-b1d4f7702de2-kube-api-access-zzdmp\") pod \"service-ca-9c57cc56f-rdq7v\" (UID: \"487f3a34-1886-4b8f-a9c3-b1d4f7702de2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.906255 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9041feac-829c-4d87-ade9-e09fd97b46ba-srv-cert\") pod \"olm-operator-6b444d44fb-q7r7s\" (UID: \"9041feac-829c-4d87-ade9-e09fd97b46ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.906281 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wckjr\" (UniqueName: \"kubernetes.io/projected/9041feac-829c-4d87-ade9-e09fd97b46ba-kube-api-access-wckjr\") pod \"olm-operator-6b444d44fb-q7r7s\" (UID: \"9041feac-829c-4d87-ade9-e09fd97b46ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.906318 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jtjwq\" (UID: \"ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.906769 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/78a93fc9-5305-44ea-a573-3e54bd52f22d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-h6bwk\" (UID: \"78a93fc9-5305-44ea-a573-3e54bd52f22d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" Oct 13 18:16:47 crc kubenswrapper[4974]: E1013 18:16:47.906862 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:48.406847038 +0000 UTC m=+143.311213108 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.907456 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-mountpoint-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.907762 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18bd98c7-66c4-47fc-b2e4-b53134a191b8-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mxlt\" (UID: \"18bd98c7-66c4-47fc-b2e4-b53134a191b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.907798 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/53c8c837-3981-467c-92ae-2f0fd6cfbb48-config-volume\") pod \"dns-default-fgm8n\" (UID: \"53c8c837-3981-467c-92ae-2f0fd6cfbb48\") " pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.908035 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-registration-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " 
pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.908073 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b317e9fb-3b83-47f2-971b-7e509a59782f-images\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.909130 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-csi-data-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.909299 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-socket-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.910349 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-jtjwq\" (UID: \"ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.910445 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3593c403-0e2d-4cee-a2fa-8df43371df5a-srv-cert\") pod \"catalog-operator-68c6474976-bhw5m\" (UID: 
\"3593c403-0e2d-4cee-a2fa-8df43371df5a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.913082 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b930522f-1b72-4d47-a50e-883ed2c2facb-metrics-tls\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.913277 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11322ed1-eb0b-4d64-890d-2f4948c102a7-config\") pod \"service-ca-operator-777779d784-lvmtc\" (UID: \"11322ed1-eb0b-4d64-890d-2f4948c102a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.913637 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9041feac-829c-4d87-ade9-e09fd97b46ba-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q7r7s\" (UID: \"9041feac-829c-4d87-ade9-e09fd97b46ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.914171 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/73685109-d1de-4e40-8bfe-3b817aaadabd-apiservice-cert\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.915186 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/73685109-d1de-4e40-8bfe-3b817aaadabd-tmpfs\") pod 
\"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.915769 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7429c3e4-2ad8-4373-807a-b69a11868c49-secret-volume\") pod \"collect-profiles-29339655-svflh\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.915826 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/53c8c837-3981-467c-92ae-2f0fd6cfbb48-metrics-tls\") pod \"dns-default-fgm8n\" (UID: \"53c8c837-3981-467c-92ae-2f0fd6cfbb48\") " pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.915977 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/78b85293-78d3-47e8-8546-7d625828b45a-plugins-dir\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.916572 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7429c3e4-2ad8-4373-807a-b69a11868c49-config-volume\") pod \"collect-profiles-29339655-svflh\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.916780 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b9012940-31e8-4aeb-98a5-a08550e781b1-node-bootstrap-token\") pod 
\"machine-config-server-7486c\" (UID: \"b9012940-31e8-4aeb-98a5-a08550e781b1\") " pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.917341 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b317e9fb-3b83-47f2-971b-7e509a59782f-proxy-tls\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.917814 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/487f3a34-1886-4b8f-a9c3-b1d4f7702de2-signing-cabundle\") pod \"service-ca-9c57cc56f-rdq7v\" (UID: \"487f3a34-1886-4b8f-a9c3-b1d4f7702de2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.918190 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3593c403-0e2d-4cee-a2fa-8df43371df5a-profile-collector-cert\") pod \"catalog-operator-68c6474976-bhw5m\" (UID: \"3593c403-0e2d-4cee-a2fa-8df43371df5a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.917865 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wph2s\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.919327 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/77988f90-8a72-4a03-9394-9f1971c21484-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-4r245\" (UID: \"77988f90-8a72-4a03-9394-9f1971c21484\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.919350 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/53800734-bd36-4fd6-9ed9-f69208f973b2-cert\") pod \"ingress-canary-xfvq4\" (UID: \"53800734-bd36-4fd6-9ed9-f69208f973b2\") " pod="openshift-ingress-canary/ingress-canary-xfvq4" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.919361 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b9012940-31e8-4aeb-98a5-a08550e781b1-certs\") pod \"machine-config-server-7486c\" (UID: \"b9012940-31e8-4aeb-98a5-a08550e781b1\") " pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.919609 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/18bd98c7-66c4-47fc-b2e4-b53134a191b8-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mxlt\" (UID: \"18bd98c7-66c4-47fc-b2e4-b53134a191b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.920180 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/73685109-d1de-4e40-8bfe-3b817aaadabd-webhook-cert\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.920346 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wph2s\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.920961 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b317e9fb-3b83-47f2-971b-7e509a59782f-auth-proxy-config\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.920970 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9041feac-829c-4d87-ade9-e09fd97b46ba-srv-cert\") pod \"olm-operator-6b444d44fb-q7r7s\" (UID: \"9041feac-829c-4d87-ade9-e09fd97b46ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.921715 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/487f3a34-1886-4b8f-a9c3-b1d4f7702de2-signing-key\") pod \"service-ca-9c57cc56f-rdq7v\" (UID: \"487f3a34-1886-4b8f-a9c3-b1d4f7702de2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.921873 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11322ed1-eb0b-4d64-890d-2f4948c102a7-serving-cert\") pod \"service-ca-operator-777779d784-lvmtc\" (UID: \"11322ed1-eb0b-4d64-890d-2f4948c102a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.926891 4974 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.931889 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fchnm\" (UniqueName: \"kubernetes.io/projected/88ad4d09-fc16-4a33-90de-64fe96f64c12-kube-api-access-fchnm\") pod \"etcd-operator-b45778765-xgp8t\" (UID: \"88ad4d09-fc16-4a33-90de-64fe96f64c12\") " pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.941542 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xthtl\" (UniqueName: \"kubernetes.io/projected/c739a775-23db-45ab-b1d0-4860b2353cab-kube-api-access-xthtl\") pod \"machine-config-controller-84d6567774-9j6ns\" (UID: \"c739a775-23db-45ab-b1d0-4860b2353cab\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.947363 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.955207 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.962154 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6485l\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-kube-api-access-6485l\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:47 crc kubenswrapper[4974]: I1013 18:16:47.991516 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkhg5\" (UniqueName: \"kubernetes.io/projected/e713c27c-e93f-44ce-8e9a-d908099b699c-kube-api-access-xkhg5\") pod \"router-default-5444994796-9p2fb\" (UID: \"e713c27c-e93f-44ce-8e9a-d908099b699c\") " pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.008644 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.008995 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:48 crc kubenswrapper[4974]: E1013 18:16:48.009342 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:48.509328443 +0000 UTC m=+143.413694523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.011939 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a933da9-d7fc-40a5-ae28-4fb8e641722d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lbfzq\" (UID: \"6a933da9-d7fc-40a5-ae28-4fb8e641722d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.020777 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l29rd\" (UniqueName: \"kubernetes.io/projected/9afdbe18-a37f-4678-9552-026d1d55946d-kube-api-access-l29rd\") pod \"console-operator-58897d9998-tp4tf\" (UID: \"9afdbe18-a37f-4678-9552-026d1d55946d\") " pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.045516 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7sxk\" (UniqueName: \"kubernetes.io/projected/34b64103-310c-4c28-80ca-decee688ee64-kube-api-access-c7sxk\") pod \"downloads-7954f5f757-rr898\" (UID: \"34b64103-310c-4c28-80ca-decee688ee64\") " pod="openshift-console/downloads-7954f5f757-rr898" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.060808 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c22cg\" (UniqueName: \"kubernetes.io/projected/57abf354-bc90-422f-a35d-c0185fa85ca1-kube-api-access-c22cg\") pod 
\"dns-operator-744455d44c-dsjgz\" (UID: \"57abf354-bc90-422f-a35d-c0185fa85ca1\") " pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.104370 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjfjm\" (UniqueName: \"kubernetes.io/projected/78a93fc9-5305-44ea-a573-3e54bd52f22d-kube-api-access-wjfjm\") pod \"control-plane-machine-set-operator-78cbb6b69f-h6bwk\" (UID: \"78a93fc9-5305-44ea-a573-3e54bd52f22d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.110058 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:48 crc kubenswrapper[4974]: E1013 18:16:48.110532 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:48.610516613 +0000 UTC m=+143.514882693 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.126628 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dv84\" (UniqueName: \"kubernetes.io/projected/11322ed1-eb0b-4d64-890d-2f4948c102a7-kube-api-access-8dv84\") pod \"service-ca-operator-777779d784-lvmtc\" (UID: \"11322ed1-eb0b-4d64-890d-2f4948c102a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.156588 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knkc8\" (UniqueName: \"kubernetes.io/projected/7429c3e4-2ad8-4373-807a-b69a11868c49-kube-api-access-knkc8\") pod \"collect-profiles-29339655-svflh\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.159017 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb"] Oct 13 18:16:48 crc kubenswrapper[4974]: W1013 18:16:48.180565 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc139dd79_4db7_4c96_83af_378886acfcf8.slice/crio-3e52893c54180b1a058f979f58c6d8a8c65de293648acb8dbaef8871065542d5 WatchSource:0}: Error finding container 3e52893c54180b1a058f979f58c6d8a8c65de293648acb8dbaef8871065542d5: Status 404 returned error can't find the container with id 
3e52893c54180b1a058f979f58c6d8a8c65de293648acb8dbaef8871065542d5 Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.181073 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6wcx\" (UniqueName: \"kubernetes.io/projected/18bd98c7-66c4-47fc-b2e4-b53134a191b8-kube-api-access-m6wcx\") pod \"kube-storage-version-migrator-operator-b67b599dd-9mxlt\" (UID: \"18bd98c7-66c4-47fc-b2e4-b53134a191b8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.183268 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.199062 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh7fr\" (UniqueName: \"kubernetes.io/projected/b930522f-1b72-4d47-a50e-883ed2c2facb-kube-api-access-hh7fr\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.200181 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp"] Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.212562 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:48 crc kubenswrapper[4974]: E1013 18:16:48.212962 4974 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:48.712944377 +0000 UTC m=+143.617310457 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.213299 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-rr898" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.215101 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2qqc\" (UniqueName: \"kubernetes.io/projected/fa74b416-3f9b-45de-a657-79a31f755b9c-kube-api-access-g2qqc\") pod \"marketplace-operator-79b997595-wph2s\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.219954 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.220994 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b930522f-1b72-4d47-a50e-883ed2c2facb-bound-sa-token\") pod \"ingress-operator-5b745b69d9-dwl4j\" (UID: \"b930522f-1b72-4d47-a50e-883ed2c2facb\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.235988 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.246856 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9kth\" (UniqueName: \"kubernetes.io/projected/b317e9fb-3b83-47f2-971b-7e509a59782f-kube-api-access-x9kth\") pod \"machine-config-operator-74547568cd-5mjnq\" (UID: \"b317e9fb-3b83-47f2-971b-7e509a59782f\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.262499 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vkml\" (UniqueName: \"kubernetes.io/projected/73685109-d1de-4e40-8bfe-3b817aaadabd-kube-api-access-2vkml\") pod \"packageserver-d55dfcdfc-9b6fg\" (UID: \"73685109-d1de-4e40-8bfe-3b817aaadabd\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.270218 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.282172 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zbcz\" (UniqueName: \"kubernetes.io/projected/53c8c837-3981-467c-92ae-2f0fd6cfbb48-kube-api-access-4zbcz\") pod \"dns-default-fgm8n\" (UID: \"53c8c837-3981-467c-92ae-2f0fd6cfbb48\") " pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.300356 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.315742 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wtbj\" (UniqueName: \"kubernetes.io/projected/83624c06-f2ff-4706-b4cd-1d41edb91898-kube-api-access-5wtbj\") pod \"migrator-59844c95c7-kkk6m\" (UID: \"83624c06-f2ff-4706-b4cd-1d41edb91898\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.324334 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njwpj\" (UniqueName: \"kubernetes.io/projected/b9012940-31e8-4aeb-98a5-a08550e781b1-kube-api-access-njwpj\") pod \"machine-config-server-7486c\" (UID: \"b9012940-31e8-4aeb-98a5-a08550e781b1\") " pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.324985 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.325587 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:48 crc kubenswrapper[4974]: E1013 18:16:48.325998 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:48.825983638 +0000 UTC m=+143.730349708 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.326066 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.326943 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.346389 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.363605 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chzdp\" (UniqueName: \"kubernetes.io/projected/78b85293-78d3-47e8-8546-7d625828b45a-kube-api-access-chzdp\") pod \"csi-hostpathplugin-ng69k\" (UID: \"78b85293-78d3-47e8-8546-7d625828b45a\") " pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.364486 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.373760 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.381282 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.389333 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7"] Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.396181 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nt7q\" (UniqueName: \"kubernetes.io/projected/77988f90-8a72-4a03-9394-9f1971c21484-kube-api-access-2nt7q\") pod \"multus-admission-controller-857f4d67dd-4r245\" (UID: \"77988f90-8a72-4a03-9394-9f1971c21484\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.398025 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.401274 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pb54\" (UniqueName: \"kubernetes.io/projected/3593c403-0e2d-4cee-a2fa-8df43371df5a-kube-api-access-4pb54\") pod \"catalog-operator-68c6474976-bhw5m\" (UID: \"3593c403-0e2d-4cee-a2fa-8df43371df5a\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.402846 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.417102 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.417558 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzdmp\" (UniqueName: \"kubernetes.io/projected/487f3a34-1886-4b8f-a9c3-b1d4f7702de2-kube-api-access-zzdmp\") pod \"service-ca-9c57cc56f-rdq7v\" (UID: \"487f3a34-1886-4b8f-a9c3-b1d4f7702de2\") " pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.440762 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-ng69k" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.441353 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-7486c" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.441764 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:48 crc kubenswrapper[4974]: E1013 18:16:48.442172 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:48.942159648 +0000 UTC m=+143.846525728 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.454039 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5tgh\" (UniqueName: \"kubernetes.io/projected/53800734-bd36-4fd6-9ed9-f69208f973b2-kube-api-access-d5tgh\") pod \"ingress-canary-xfvq4\" (UID: \"53800734-bd36-4fd6-9ed9-f69208f973b2\") " pod="openshift-ingress-canary/ingress-canary-xfvq4" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.454963 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns"] Oct 13 18:16:48 crc kubenswrapper[4974]: 
I1013 18:16:48.458439 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2zqr\" (UniqueName: \"kubernetes.io/projected/ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1-kube-api-access-p2zqr\") pod \"package-server-manager-789f6589d5-jtjwq\" (UID: \"ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.497363 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-xgp8t"] Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.505495 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wckjr\" (UniqueName: \"kubernetes.io/projected/9041feac-829c-4d87-ade9-e09fd97b46ba-kube-api-access-wckjr\") pod \"olm-operator-6b444d44fb-q7r7s\" (UID: \"9041feac-829c-4d87-ade9-e09fd97b46ba\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.543162 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:48 crc kubenswrapper[4974]: E1013 18:16:48.543580 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:49.043565293 +0000 UTC m=+143.947931373 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:48 crc kubenswrapper[4974]: W1013 18:16:48.590795 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ce9e429_c79c_41d2_82d6_f3a432fd3173.slice/crio-7062d987b758845472e66eb6991c13a384bf0b7e8ea60c73df6b3857d0836471 WatchSource:0}: Error finding container 7062d987b758845472e66eb6991c13a384bf0b7e8ea60c73df6b3857d0836471: Status 404 returned error can't find the container with id 7062d987b758845472e66eb6991c13a384bf0b7e8ea60c73df6b3857d0836471 Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.636031 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.647303 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:48 crc kubenswrapper[4974]: E1013 18:16:48.647712 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-13 18:16:49.147700635 +0000 UTC m=+144.052066715 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.647819 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.657406 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.688587 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.691578 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" event={"ID":"820c88f3-f632-4375-b569-d796628c8f73","Type":"ContainerStarted","Data":"bb98fdd7a75d6fd5a4d665be458b3c7e473223f91f3cc7653e0d21916a9ac7f3"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.691618 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" event={"ID":"820c88f3-f632-4375-b569-d796628c8f73","Type":"ContainerStarted","Data":"74dad6e93b833ff723265b4bd5a6b5c91d90dab3ddebcc8a04f72b12a4264a93"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.694427 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.696116 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" event={"ID":"c739a775-23db-45ab-b1d0-4860b2353cab","Type":"ContainerStarted","Data":"7d0fb0f49e85fb05d7612834b184baf3c12fa1b8ef83b685e97378cc2075f876"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.726379 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" event={"ID":"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551","Type":"ContainerStarted","Data":"6dcf2a7e02c9f472dd097ff98b24a78c3df826c04b4ac59d7c31be760a3f1b31"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.726419 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" 
event={"ID":"2ffe8fb4-5a71-4ef6-aef6-96603b8e7551","Type":"ContainerStarted","Data":"7dcce9f619e515b086a537ba2edc80b0dd2a1431ecc6bbce7326a1f492f78605"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.747950 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xfvq4" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.749573 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:48 crc kubenswrapper[4974]: E1013 18:16:48.752053 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:49.252020893 +0000 UTC m=+144.156386973 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.759559 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9p2fb" event={"ID":"e713c27c-e93f-44ce-8e9a-d908099b699c","Type":"ContainerStarted","Data":"374d1045be1b7fc8d57679a3a6b490be7f2223215d72c881ded9eacfa61fa203"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.776793 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" event={"ID":"4b386d31-0a67-4d5a-910d-79a22339feaf","Type":"ContainerStarted","Data":"9642c20c597734b868e5ebc85c33f1c34f5f078ee972fff4e236293005325eb7"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.786897 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" event={"ID":"4b386d31-0a67-4d5a-910d-79a22339feaf","Type":"ContainerStarted","Data":"9d8d7529be56509f1c59e868b0c2c378eb5ca9f8eab65e6e217c14db115bf5c1"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.786915 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" event={"ID":"9ce9e429-c79c-41d2-82d6-f3a432fd3173","Type":"ContainerStarted","Data":"7062d987b758845472e66eb6991c13a384bf0b7e8ea60c73df6b3857d0836471"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.804315 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" 
event={"ID":"88ad4d09-fc16-4a33-90de-64fe96f64c12","Type":"ContainerStarted","Data":"51f8654e979505bccab6a60f24b152b47d27854953dc2d5deaa75fad9f69f42c"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.831806 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" event={"ID":"73a7134a-5187-4b6e-aa08-40146a99e93e","Type":"ContainerStarted","Data":"8c7a154d0f256890776319afb036598c62299c42074af025294a532d12b4400f"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.832456 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" event={"ID":"73a7134a-5187-4b6e-aa08-40146a99e93e","Type":"ContainerStarted","Data":"84fe6aecf26f77c2ce26a2ef26b3ec60b10833462aa73609820320326ed38681"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.835146 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" event={"ID":"c139dd79-4db7-4c96-83af-378886acfcf8","Type":"ContainerStarted","Data":"56141b9ae54711de31a9ec9660b358ff07b786d5ba3174a779d0fcdb06123b2b"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.835196 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" event={"ID":"c139dd79-4db7-4c96-83af-378886acfcf8","Type":"ContainerStarted","Data":"3e52893c54180b1a058f979f58c6d8a8c65de293648acb8dbaef8871065542d5"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.854896 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:48 crc kubenswrapper[4974]: E1013 18:16:48.858970 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:49.358955313 +0000 UTC m=+144.263321483 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.881076 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" event={"ID":"ba55033e-0bef-4588-bdac-820122c4fd4e","Type":"ContainerStarted","Data":"01ed295816b1a95947d9a375e063ea0f50f773e6517423e147ef3f83945e2566"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.881140 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" event={"ID":"ba55033e-0bef-4588-bdac-820122c4fd4e","Type":"ContainerStarted","Data":"aa1a3e7f80f007240ba88fdc6635e01c3e75e05b1d988ffb5ba66498a60cef0f"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.904527 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" event={"ID":"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8","Type":"ContainerStarted","Data":"36b4278c9cbe56db658ff76c8a63b89816cd1036654e128828fd3a5637c10b21"} Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 
18:16:48.915874 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.916296 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:16:48 crc kubenswrapper[4974]: I1013 18:16:48.957252 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:48 crc kubenswrapper[4974]: E1013 18:16:48.957697 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:49.457678253 +0000 UTC m=+144.362044333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.059579 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:49 crc kubenswrapper[4974]: E1013 18:16:49.064279 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:49.564264124 +0000 UTC m=+144.468630204 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.101223 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-22fnv" podStartSLOduration=122.101203311 podStartE2EDuration="2m2.101203311s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:49.098899986 +0000 UTC m=+144.003266086" watchObservedRunningTime="2025-10-13 18:16:49.101203311 +0000 UTC m=+144.005569401" Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.123155 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-np2sc" podStartSLOduration=122.123139196 podStartE2EDuration="2m2.123139196s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:49.12219553 +0000 UTC m=+144.026561620" watchObservedRunningTime="2025-10-13 18:16:49.123139196 +0000 UTC m=+144.027505276" Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.168063 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:49 crc kubenswrapper[4974]: E1013 18:16:49.168503 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:49.668483548 +0000 UTC m=+144.572849628 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.201968 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-lbflg" podStartSLOduration=122.201949237 podStartE2EDuration="2m2.201949237s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:49.201247928 +0000 UTC m=+144.105614008" watchObservedRunningTime="2025-10-13 18:16:49.201949237 +0000 UTC m=+144.106315317" Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.274626 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:49 crc kubenswrapper[4974]: E1013 18:16:49.275016 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:49.775002587 +0000 UTC m=+144.679368667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.333284 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.336433 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tp4tf"] Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.376474 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:49 crc kubenswrapper[4974]: E1013 18:16:49.376795 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-13 18:16:49.876780293 +0000 UTC m=+144.781146373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.400454 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" podStartSLOduration=121.400434707 podStartE2EDuration="2m1.400434707s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:49.358647604 +0000 UTC m=+144.263013684" watchObservedRunningTime="2025-10-13 18:16:49.400434707 +0000 UTC m=+144.304800777" Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.400826 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-jbznq" podStartSLOduration=122.400819808 podStartE2EDuration="2m2.400819808s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:49.400140368 +0000 UTC m=+144.304506448" watchObservedRunningTime="2025-10-13 18:16:49.400819808 +0000 UTC m=+144.305185888" Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.442970 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-dsjgz"] Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.448535 4974 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq"] Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.451191 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg"] Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.458522 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-rr898"] Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.479486 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:49 crc kubenswrapper[4974]: E1013 18:16:49.479819 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:49.979806864 +0000 UTC m=+144.884172944 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.580542 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:49 crc kubenswrapper[4974]: E1013 18:16:49.581604 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:50.08158396 +0000 UTC m=+144.985950040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.625544 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" podStartSLOduration=122.625517602 podStartE2EDuration="2m2.625517602s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:49.594260455 +0000 UTC m=+144.498626555" watchObservedRunningTime="2025-10-13 18:16:49.625517602 +0000 UTC m=+144.529883682" Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.697164 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.697718 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" podStartSLOduration=122.697693918 podStartE2EDuration="2m2.697693918s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:49.696931466 +0000 UTC m=+144.601297546" 
watchObservedRunningTime="2025-10-13 18:16:49.697693918 +0000 UTC m=+144.602059998" Oct 13 18:16:49 crc kubenswrapper[4974]: E1013 18:16:49.699667 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:50.199637122 +0000 UTC m=+145.104003202 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.777557 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-bpshp" podStartSLOduration=122.777541038 podStartE2EDuration="2m2.777541038s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:49.775930873 +0000 UTC m=+144.680296953" watchObservedRunningTime="2025-10-13 18:16:49.777541038 +0000 UTC m=+144.681907118" Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.807421 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-m7pjr" podStartSLOduration=122.807400946 podStartE2EDuration="2m2.807400946s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-10-13 18:16:49.806099579 +0000 UTC m=+144.710465669" watchObservedRunningTime="2025-10-13 18:16:49.807400946 +0000 UTC m=+144.711767026" Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.821033 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:49 crc kubenswrapper[4974]: E1013 18:16:49.821458 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:50.32143592 +0000 UTC m=+145.225802000 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.919714 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fgm8n"] Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.926035 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq"] Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.930482 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:49 crc kubenswrapper[4974]: E1013 18:16:49.930788 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:50.430777138 +0000 UTC m=+145.335143218 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:49 crc kubenswrapper[4974]: I1013 18:16:49.962797 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.005819 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" event={"ID":"6a933da9-d7fc-40a5-ae28-4fb8e641722d","Type":"ContainerStarted","Data":"f9f1c7dcf235c0359afc7b00484e143a367cf2fa7f44a4867c189383a9e919a3"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.013588 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x24fx" podStartSLOduration=123.013575821 podStartE2EDuration="2m3.013575821s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:50.012704016 +0000 UTC m=+144.917070096" watchObservedRunningTime="2025-10-13 18:16:50.013575821 +0000 UTC m=+144.917941891" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.015969 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9p2fb" event={"ID":"e713c27c-e93f-44ce-8e9a-d908099b699c","Type":"ContainerStarted","Data":"98dbf19cd2513b4034f02b859cf58f63b3aaeb9eb894081b6d90c583281e3d27"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.033743 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:50 crc kubenswrapper[4974]: E1013 18:16:50.034767 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:50.534753135 +0000 UTC m=+145.439119215 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.050981 4974 generic.go:334] "Generic (PLEG): container finished" podID="9ce9e429-c79c-41d2-82d6-f3a432fd3173" containerID="447f21975656f321367634b1034207bbdd466994c970525437a20ccc575eef53" exitCode=0 Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.051048 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" event={"ID":"9ce9e429-c79c-41d2-82d6-f3a432fd3173","Type":"ContainerDied","Data":"447f21975656f321367634b1034207bbdd466994c970525437a20ccc575eef53"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.069898 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" podStartSLOduration=123.069881951 podStartE2EDuration="2m3.069881951s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:50.069580562 +0000 UTC m=+144.973946642" watchObservedRunningTime="2025-10-13 18:16:50.069881951 +0000 UTC m=+144.974248021" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.073990 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.076535 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" event={"ID":"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74","Type":"ContainerStarted","Data":"7c323ea8fa9d5fe16964119b085c6eb94882959e00d81adf15771b4ef0d099e7"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.091167 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" event={"ID":"c739a775-23db-45ab-b1d0-4860b2353cab","Type":"ContainerStarted","Data":"7746409e5e6a034b3c2340f7921c927b6e40a2c2af82e7688ee857561b10a190"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.091209 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" event={"ID":"c739a775-23db-45ab-b1d0-4860b2353cab","Type":"ContainerStarted","Data":"d9128109a5864af80d95849445398a89a0527e17fe78d5d7620cf6ad7c948218"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.112314 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wph2s"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.112714 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-khv5p" podStartSLOduration=123.110637984 podStartE2EDuration="2m3.110637984s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:50.104091101 +0000 UTC m=+145.008457181" watchObservedRunningTime="2025-10-13 18:16:50.110637984 +0000 UTC m=+145.015004064" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.135422 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.135986 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" event={"ID":"f8a0815a-ba28-4e47-ba48-2b6e9270a3d8","Type":"ContainerStarted","Data":"f8ce86de78d4263fae98d0517cf9607e622940a3b68e0091dd1925d032fdf25f"} Oct 13 18:16:50 crc kubenswrapper[4974]: E1013 18:16:50.138108 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:50.638033103 +0000 UTC m=+145.542399233 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.141140 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" event={"ID":"57abf354-bc90-422f-a35d-c0185fa85ca1","Type":"ContainerStarted","Data":"76abb6025d735095db55553610c01702e13dd3c48659641bbc1f426687b446f8"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.175861 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" 
event={"ID":"88ad4d09-fc16-4a33-90de-64fe96f64c12","Type":"ContainerStarted","Data":"5f8d2f2e1aa6dba6a8d9c27e862c3feea02b606617d74324f32d02e430d45cd3"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.202422 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rr898" event={"ID":"34b64103-310c-4c28-80ca-decee688ee64","Type":"ContainerStarted","Data":"439b0f576f7608366854ce8bc708d92e389f8ef44da5cc0c7e52de5909999993"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.218309 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tp4tf" event={"ID":"9afdbe18-a37f-4678-9552-026d1d55946d","Type":"ContainerStarted","Data":"7242a18ccbf9e15c92e4a7773d3dc3d7af4be258d62f45377a1ab5b07137c093"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.220085 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.230955 4974 patch_prober.go:28] interesting pod/console-operator-58897d9998-tp4tf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.231006 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tp4tf" podUID="9afdbe18-a37f-4678-9552-026d1d55946d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.236908 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:50 crc kubenswrapper[4974]: E1013 18:16:50.245596 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:50.74554887 +0000 UTC m=+145.649914950 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.269423 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-b27hb" podStartSLOduration=123.269404129 podStartE2EDuration="2m3.269404129s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:50.249584403 +0000 UTC m=+145.153950483" watchObservedRunningTime="2025-10-13 18:16:50.269404129 +0000 UTC m=+145.173770209" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.269563 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.271315 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.283137 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7486c" event={"ID":"b9012940-31e8-4aeb-98a5-a08550e781b1","Type":"ContainerStarted","Data":"0d4eae7f12bb861cab490d011b4f4c0bf02f95b5b2ad2fe74109808153c3631b"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.283186 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-7486c" event={"ID":"b9012940-31e8-4aeb-98a5-a08550e781b1","Type":"ContainerStarted","Data":"e8ca01f25d699e9f03d27c3745ec2436341a937640bbfb84a76be6ad4aa86370"} Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.292460 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-knlnk" podStartSLOduration=122.292437516 podStartE2EDuration="2m2.292437516s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:50.2851003 +0000 UTC m=+145.189466380" watchObservedRunningTime="2025-10-13 18:16:50.292437516 +0000 UTC m=+145.196803596" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.326048 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:16:50 crc kubenswrapper[4974]: [-]has-synced failed: reason withheld Oct 13 18:16:50 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:16:50 crc kubenswrapper[4974]: healthz check failed Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.326175 4974 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.341740 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.351937 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:50 crc kubenswrapper[4974]: E1013 18:16:50.383284 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:50.883265244 +0000 UTC m=+145.787631324 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:50 crc kubenswrapper[4974]: W1013 18:16:50.446384 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83624c06_f2ff_4706_b4cd_1d41edb91898.slice/crio-0829b51789d36f7424c77b230d454bd69915fe7b803b770e03a60e4929ccbd55 WatchSource:0}: Error finding container 0829b51789d36f7424c77b230d454bd69915fe7b803b770e03a60e4929ccbd55: Status 404 returned error can't find the container with id 0829b51789d36f7424c77b230d454bd69915fe7b803b770e03a60e4929ccbd55 Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.468061 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.480194 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-xgp8t" podStartSLOduration=123.480172284 podStartE2EDuration="2m3.480172284s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:50.477066866 +0000 UTC m=+145.381432946" watchObservedRunningTime="2025-10-13 18:16:50.480172284 +0000 UTC m=+145.384538364" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.491538 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:50 crc kubenswrapper[4974]: E1013 18:16:50.491827 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:50.99181486 +0000 UTC m=+145.896180940 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.496739 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.496807 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xfvq4"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.512534 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.512593 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.555238 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ingress/router-default-5444994796-9p2fb" podStartSLOduration=123.555217759 podStartE2EDuration="2m3.555217759s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:50.548083559 +0000 UTC m=+145.452449639" watchObservedRunningTime="2025-10-13 18:16:50.555217759 +0000 UTC m=+145.459583839" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.578097 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-ng69k"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.592546 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:50 crc kubenswrapper[4974]: E1013 18:16:50.592922 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:51.092909737 +0000 UTC m=+145.997275817 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.596358 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.596410 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-rdq7v"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.622221 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.635462 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-4r245"] Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.694544 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:50 crc kubenswrapper[4974]: E1013 18:16:50.694858 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:51.194840767 +0000 UTC m=+146.099206847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.727970 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-9j6ns" podStartSLOduration=122.727953466 podStartE2EDuration="2m2.727953466s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:50.682157741 +0000 UTC m=+145.586523821" watchObservedRunningTime="2025-10-13 18:16:50.727953466 +0000 UTC m=+145.632319546" Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.728736 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-tp4tf" podStartSLOduration=123.728728728 podStartE2EDuration="2m3.728728728s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:50.722188164 +0000 UTC m=+145.626554244" watchObservedRunningTime="2025-10-13 18:16:50.728728728 +0000 UTC m=+145.633094818" Oct 13 18:16:50 crc kubenswrapper[4974]: W1013 18:16:50.741024 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73685109_d1de_4e40_8bfe_3b817aaadabd.slice/crio-f9652c9c7b71c8b42a9bc0f8132df9ad38359910a1a93e4c59c698c4328e04f3 WatchSource:0}: Error finding 
container f9652c9c7b71c8b42a9bc0f8132df9ad38359910a1a93e4c59c698c4328e04f3: Status 404 returned error can't find the container with id f9652c9c7b71c8b42a9bc0f8132df9ad38359910a1a93e4c59c698c4328e04f3 Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.795623 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:50 crc kubenswrapper[4974]: E1013 18:16:50.795986 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:51.295971425 +0000 UTC m=+146.200337515 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.899043 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:50 crc kubenswrapper[4974]: E1013 18:16:50.899563 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:51.399548131 +0000 UTC m=+146.303914211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:50 crc kubenswrapper[4974]: I1013 18:16:50.901618 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-7486c" podStartSLOduration=5.901597068 podStartE2EDuration="5.901597068s" podCreationTimestamp="2025-10-13 18:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:50.812266592 +0000 UTC m=+145.716632672" watchObservedRunningTime="2025-10-13 18:16:50.901597068 +0000 UTC m=+145.805963148" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.037210 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:51 crc kubenswrapper[4974]: E1013 18:16:51.037627 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:51.537609905 +0000 UTC m=+146.441975985 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.138537 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:51 crc kubenswrapper[4974]: E1013 18:16:51.138852 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:51.638837055 +0000 UTC m=+146.543203135 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.244914 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:51 crc kubenswrapper[4974]: E1013 18:16:51.245209 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:51.745196979 +0000 UTC m=+146.649563059 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.295295 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:16:51 crc kubenswrapper[4974]: [-]has-synced failed: reason withheld Oct 13 18:16:51 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:16:51 crc kubenswrapper[4974]: healthz check failed Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.295355 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.345096 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" event={"ID":"6a933da9-d7fc-40a5-ae28-4fb8e641722d","Type":"ContainerStarted","Data":"9d8dc387bca3dbd196a67ced32e570a84df900ba366000e31b9f7374d57327ff"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.347438 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:51 crc kubenswrapper[4974]: E1013 18:16:51.347862 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:51.847839828 +0000 UTC m=+146.752205908 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.377221 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" event={"ID":"9ce9e429-c79c-41d2-82d6-f3a432fd3173","Type":"ContainerStarted","Data":"f937243d4b8311ed931f53e078cfcce67cb7030347c5ed7cdcf3bd6925dff178"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.400102 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" event={"ID":"18bd98c7-66c4-47fc-b2e4-b53134a191b8","Type":"ContainerStarted","Data":"13717b26e681615598412d03503a01a19a8123906ee43909a1ce6012c8064998"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.407803 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-rr898" event={"ID":"34b64103-310c-4c28-80ca-decee688ee64","Type":"ContainerStarted","Data":"08daa161fac587f7b8a86d77dec438ba8a89f6a9b91d875472310aead83db8f4"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.408721 4974 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-rr898" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.412021 4974 patch_prober.go:28] interesting pod/downloads-7954f5f757-rr898 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.412151 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rr898" podUID="34b64103-310c-4c28-80ca-decee688ee64" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.413628 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" event={"ID":"b930522f-1b72-4d47-a50e-883ed2c2facb","Type":"ContainerStarted","Data":"b559565db6fbad4777891727e6e07c99ada94b1f2528e90eda946d74d0e38fb5"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.413732 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" event={"ID":"b930522f-1b72-4d47-a50e-883ed2c2facb","Type":"ContainerStarted","Data":"c68d194c2e52659023387f89f95d3dd9418a176bf0f06a41f69c105306a4e960"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.418347 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" event={"ID":"7429c3e4-2ad8-4373-807a-b69a11868c49","Type":"ContainerStarted","Data":"6987a458e60aa4a01a60b7ea12f5880471c2d961c909c4f01d58aa5a4237f997"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.422042 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" event={"ID":"73685109-d1de-4e40-8bfe-3b817aaadabd","Type":"ContainerStarted","Data":"f9652c9c7b71c8b42a9bc0f8132df9ad38359910a1a93e4c59c698c4328e04f3"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.444960 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" event={"ID":"11322ed1-eb0b-4d64-890d-2f4948c102a7","Type":"ContainerStarted","Data":"6fff77812899ec36f833bf5fd5fb1b7c05754eed1de3dbe912e541c11dfd0811"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.453538 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:51 crc kubenswrapper[4974]: E1013 18:16:51.455146 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:51.955122169 +0000 UTC m=+146.859488479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.488258 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" event={"ID":"ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1","Type":"ContainerStarted","Data":"bac4dc88b3cd832bf1bbcebe3cb25a33a6b654db4288858c5a72bcb56cbed416"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.490495 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-rr898" podStartSLOduration=124.490482101 podStartE2EDuration="2m4.490482101s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:51.488057343 +0000 UTC m=+146.392423413" watchObservedRunningTime="2025-10-13 18:16:51.490482101 +0000 UTC m=+146.394848191" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.491136 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.491179 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.546888 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m" 
event={"ID":"83624c06-f2ff-4706-b4cd-1d41edb91898","Type":"ContainerStarted","Data":"bd8931d83ef7dacd5a0dc25b87cce78a90829edd7b5c8ccf02d345657ba78e5d"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.546938 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m" event={"ID":"83624c06-f2ff-4706-b4cd-1d41edb91898","Type":"ContainerStarted","Data":"0829b51789d36f7424c77b230d454bd69915fe7b803b770e03a60e4929ccbd55"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.555473 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:51 crc kubenswrapper[4974]: E1013 18:16:51.557126 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:52.05710843 +0000 UTC m=+146.961474510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.557417 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tp4tf" event={"ID":"9afdbe18-a37f-4678-9552-026d1d55946d","Type":"ContainerStarted","Data":"ee53df22ea12b7d1b288152175d6395e9ce102622df385fd7ddc1a4e70833848"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.558862 4974 patch_prober.go:28] interesting pod/console-operator-58897d9998-tp4tf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.558893 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tp4tf" podUID="9afdbe18-a37f-4678-9552-026d1d55946d" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.12:8443/readyz\": dial tcp 10.217.0.12:8443: connect: connection refused" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.586583 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" event={"ID":"fa74b416-3f9b-45de-a657-79a31f755b9c","Type":"ContainerStarted","Data":"3dfec715d330631610c39ecf853902834fe263a2b2c4c604c76da49c6ddd88ae"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.586627 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" event={"ID":"fa74b416-3f9b-45de-a657-79a31f755b9c","Type":"ContainerStarted","Data":"470318fda49410b6d5543b35484039d375b46cf4d5dcccb891aaa0ae29748da7"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.588123 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.591009 4974 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wph2s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.591054 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" podUID="fa74b416-3f9b-45de-a657-79a31f755b9c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.614330 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" event={"ID":"3593c403-0e2d-4cee-a2fa-8df43371df5a","Type":"ContainerStarted","Data":"284db4e0f23464397b78b67ac4da9f123916d8b1c441410d9d82ac3736b88395"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.615445 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" podStartSLOduration=123.615423937 podStartE2EDuration="2m3.615423937s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:51.614235773 +0000 UTC 
m=+146.518601853" watchObservedRunningTime="2025-10-13 18:16:51.615423937 +0000 UTC m=+146.519790017" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.615841 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lbfzq" podStartSLOduration=124.615835088 podStartE2EDuration="2m4.615835088s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:51.576620828 +0000 UTC m=+146.480986908" watchObservedRunningTime="2025-10-13 18:16:51.615835088 +0000 UTC m=+146.520201168" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.647205 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" podStartSLOduration=123.647185038 podStartE2EDuration="2m3.647185038s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:51.645959213 +0000 UTC m=+146.550325293" watchObservedRunningTime="2025-10-13 18:16:51.647185038 +0000 UTC m=+146.551551118" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.656864 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:51 crc kubenswrapper[4974]: E1013 18:16:51.659416 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-10-13 18:16:52.159400731 +0000 UTC m=+147.063766811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.661643 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xfvq4" event={"ID":"53800734-bd36-4fd6-9ed9-f69208f973b2","Type":"ContainerStarted","Data":"d76b25f93d4b9eb6069f2b694f13e37095fbe28ef2b8aafb818c741ccbdda39f"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.696067 4974 generic.go:334] "Generic (PLEG): container finished" podID="e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74" containerID="17682f0d0f273e89d0ffd6c3ba9d0bda6d60aaa973939c113a2fb1fd3d73636b" exitCode=0 Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.696899 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" event={"ID":"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74","Type":"ContainerDied","Data":"17682f0d0f273e89d0ffd6c3ba9d0bda6d60aaa973939c113a2fb1fd3d73636b"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.733383 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fgm8n" event={"ID":"53c8c837-3981-467c-92ae-2f0fd6cfbb48","Type":"ContainerStarted","Data":"f572b73cdbf80337bcd16511ed1707811c1399968cbe548065a98a0f5b19a562"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.733701 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fgm8n" 
event={"ID":"53c8c837-3981-467c-92ae-2f0fd6cfbb48","Type":"ContainerStarted","Data":"d4be910d82c6a09db3ca8d0232e91ad731141cf78c7c37cc1a0f63397bedc50a"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.755273 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" event={"ID":"78a93fc9-5305-44ea-a573-3e54bd52f22d","Type":"ContainerStarted","Data":"d3a852d6427a07a9d9ca87013805097ad1ca43615512a3d70893001ee32ba7db"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.755332 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" event={"ID":"78a93fc9-5305-44ea-a573-3e54bd52f22d","Type":"ContainerStarted","Data":"d581d57cdd437df875e06b0954fa8ecd9fb2f6e8a756863ebd93c41cb36fc00e"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.757389 4974 patch_prober.go:28] interesting pod/apiserver-76f77b778f-mlpn9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]log ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]etcd ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]poststarthook/start-apiserver-admission-initializer ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]poststarthook/generic-apiserver-start-informers ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]poststarthook/max-in-flight-filter ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]poststarthook/storage-object-count-tracker-hook ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]poststarthook/image.openshift.io-apiserver-caches ok Oct 13 18:16:51 crc kubenswrapper[4974]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Oct 13 18:16:51 crc kubenswrapper[4974]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Oct 13 18:16:51 crc 
kubenswrapper[4974]: [+]poststarthook/project.openshift.io-projectcache ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]poststarthook/openshift.io-startinformers ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]poststarthook/openshift.io-restmapperupdater ok Oct 13 18:16:51 crc kubenswrapper[4974]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Oct 13 18:16:51 crc kubenswrapper[4974]: livez check failed Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.757897 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" podUID="f8a0815a-ba28-4e47-ba48-2b6e9270a3d8" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.758589 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:51 crc kubenswrapper[4974]: E1013 18:16:51.762179 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:52.262158794 +0000 UTC m=+147.166524874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.859463 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-h6bwk" podStartSLOduration=124.859446984 podStartE2EDuration="2m4.859446984s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:51.859193297 +0000 UTC m=+146.763559377" watchObservedRunningTime="2025-10-13 18:16:51.859446984 +0000 UTC m=+146.763813064" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.866407 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:51 crc kubenswrapper[4974]: E1013 18:16:51.867362 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:52.367348605 +0000 UTC m=+147.271714675 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.884298 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" event={"ID":"57abf354-bc90-422f-a35d-c0185fa85ca1","Type":"ContainerStarted","Data":"01fff374b884a4742fb7edcc45ebc9890347384b5d660043417c8028b3804cc6"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.890432 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ng69k" event={"ID":"78b85293-78d3-47e8-8546-7d625828b45a","Type":"ContainerStarted","Data":"1dbe9069f404c98f668b879afef8953399ef758686b49778a47d78d15815cbf0"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.921460 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" podStartSLOduration=124.921443373 podStartE2EDuration="2m4.921443373s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:51.921232917 +0000 UTC m=+146.825598997" watchObservedRunningTime="2025-10-13 18:16:51.921443373 +0000 UTC m=+146.825809443" Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.936967 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" 
event={"ID":"b317e9fb-3b83-47f2-971b-7e509a59782f","Type":"ContainerStarted","Data":"b033f8cc026962991987cac81740d09fe7ab1890ed064fa3f72dc0d612d097d3"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.937020 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" event={"ID":"b317e9fb-3b83-47f2-971b-7e509a59782f","Type":"ContainerStarted","Data":"f0674c6de0d46f24d53bc64537a2424a7a088f3dab52e24a72840ee897e8b06e"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.962617 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" event={"ID":"77988f90-8a72-4a03-9394-9f1971c21484","Type":"ContainerStarted","Data":"f7b292c5ba127cf07fafca0286747bf2a2955bd178636b4c9f6cf7bba16b8edf"} Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.972511 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:51 crc kubenswrapper[4974]: E1013 18:16:51.973799 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:52.473781032 +0000 UTC m=+147.378147102 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:51 crc kubenswrapper[4974]: I1013 18:16:51.988024 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" event={"ID":"9041feac-829c-4d87-ade9-e09fd97b46ba","Type":"ContainerStarted","Data":"f6ad5204274a6b20f02497bed2cbe69d05bc51f114c12669a4983e55c7847541"} Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.007805 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" event={"ID":"487f3a34-1886-4b8f-a9c3-b1d4f7702de2","Type":"ContainerStarted","Data":"11726fa9ecb0dca566ebadef151674408f9b6817f244787f3301dc439735d4fa"} Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.085781 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.086364 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:52.58634564 +0000 UTC m=+147.490711720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.188091 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.188794 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:52.688778524 +0000 UTC m=+147.593144604 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.283499 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:16:52 crc kubenswrapper[4974]: [-]has-synced failed: reason withheld Oct 13 18:16:52 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:16:52 crc kubenswrapper[4974]: healthz check failed Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.283848 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.299288 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.299665 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2025-10-13 18:16:52.799649365 +0000 UTC m=+147.704015445 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.400795 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.401121 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:52.901098282 +0000 UTC m=+147.805464362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.502639 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.503298 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.003273819 +0000 UTC m=+147.907639899 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.603697 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.603832 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.10380502 +0000 UTC m=+148.008171100 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.604029 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.604365 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.104351565 +0000 UTC m=+148.008717645 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.705698 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.705857 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.205833133 +0000 UTC m=+148.110199213 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.706182 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.706505 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.206494161 +0000 UTC m=+148.110860241 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.807433 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.807866 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.307826215 +0000 UTC m=+148.212192295 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.887040 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" podStartSLOduration=124.887023177 podStartE2EDuration="2m4.887023177s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:52.077682587 +0000 UTC m=+146.982048677" watchObservedRunningTime="2025-10-13 18:16:52.887023177 +0000 UTC m=+147.791389257" Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.888537 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.889468 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.892143 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.893262 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.909879 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:52 crc kubenswrapper[4974]: E1013 18:16:52.910178 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.410162466 +0000 UTC m=+148.314528556 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.941245 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.956615 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:52 crc kubenswrapper[4974]: I1013 18:16:52.956785 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.010485 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.010661 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ed10d46-3591-4e9c-bff7-e413bcdc8329\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.010723 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ed10d46-3591-4e9c-bff7-e413bcdc8329\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.010882 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.510866502 +0000 UTC m=+148.415232572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.015548 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-rdq7v" event={"ID":"487f3a34-1886-4b8f-a9c3-b1d4f7702de2","Type":"ContainerStarted","Data":"770f42fc517d6ab1c48b293966e442486fe1d16f3775b4ffd84e9ea4e1a13e30"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.017965 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fgm8n" event={"ID":"53c8c837-3981-467c-92ae-2f0fd6cfbb48","Type":"ContainerStarted","Data":"45dc9f939d623aed70b849f0bc143875020ff7ece25583d624b6c1f1b9900dd7"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.018441 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-fgm8n" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.019194 4974 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" event={"ID":"7429c3e4-2ad8-4373-807a-b69a11868c49","Type":"ContainerStarted","Data":"88a68b5de9dd2fe749342e2a2bfa1c3733354311f4ea9f85bd3bcff1fa8c451e"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.021113 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m" event={"ID":"83624c06-f2ff-4706-b4cd-1d41edb91898","Type":"ContainerStarted","Data":"9339bc8833214b3a897ba19683d592eee932ce89d0f46e4cc9988222099a7634"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.024813 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" event={"ID":"e2e08fa4-7cd7-40fe-bbae-e7b1afe08b74","Type":"ContainerStarted","Data":"8f66d567c87ac63827a700c3fca4a8a0ce083a05c383d21b7ddf8ddef1f49c11"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.024876 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.030929 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-dsjgz" event={"ID":"57abf354-bc90-422f-a35d-c0185fa85ca1","Type":"ContainerStarted","Data":"f644b67c5f38bca5297b70b8ea0e4f7b0bf91875488b6d7e8c8d9e2c9766e937"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.032817 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" event={"ID":"b317e9fb-3b83-47f2-971b-7e509a59782f","Type":"ContainerStarted","Data":"8787b81ddb9edf91a4fc6507d02158a1eeb15a47e10fbffb71ff5899d8227016"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.034478 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" event={"ID":"ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1","Type":"ContainerStarted","Data":"dcba208f735e7cca56855947dc3b2f7bc95059f5c47d2412e5a41e9d5334636a"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.034565 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" event={"ID":"ae4d9fa3-0fe5-4b9f-96e7-654ccab6efe1","Type":"ContainerStarted","Data":"656f59b045c55990adc83766a0e7d786d8cea96451c134c7040c497efe48979f"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.035005 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.037056 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" event={"ID":"9041feac-829c-4d87-ade9-e09fd97b46ba","Type":"ContainerStarted","Data":"9ee4b1bb4effaa070365feddc0efc35b833413b961238b164aeffea962f518a9"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.037544 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.039464 4974 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q7r7s container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body= Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.039518 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" podUID="9041feac-829c-4d87-ade9-e09fd97b46ba" containerName="olm-operator" probeResult="failure" 
output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.040181 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" event={"ID":"3593c403-0e2d-4cee-a2fa-8df43371df5a","Type":"ContainerStarted","Data":"893a679539b4a590cf87b8b6a19fd23b16354c9441ddb77f5ad2a0372aee88b3"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.040391 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.041789 4974 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-bhw5m container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.041863 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" podUID="3593c403-0e2d-4cee-a2fa-8df43371df5a" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.042394 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xfvq4" event={"ID":"53800734-bd36-4fd6-9ed9-f69208f973b2","Type":"ContainerStarted","Data":"ec43723216016a8450b20242fb3610257504c64956ba0becb6fd77e295d3df7e"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.046286 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fgm8n" podStartSLOduration=8.046273465 podStartE2EDuration="8.046273465s" 
podCreationTimestamp="2025-10-13 18:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.040927695 +0000 UTC m=+147.945293775" watchObservedRunningTime="2025-10-13 18:16:53.046273465 +0000 UTC m=+147.950639545" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.046839 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" event={"ID":"73685109-d1de-4e40-8bfe-3b817aaadabd","Type":"ContainerStarted","Data":"96b0734f26e3e8a3026b2843074b5326dc5fd9277a3b55647b724987659c4712"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.047250 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.048582 4974 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-9b6fg container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" start-of-body= Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.048715 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" podUID="73685109-d1de-4e40-8bfe-3b817aaadabd" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.052457 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" event={"ID":"11322ed1-eb0b-4d64-890d-2f4948c102a7","Type":"ContainerStarted","Data":"ba24c1f422bd2ea1aa14e79c5bbc5569a986af163f2df1ce3566f8b5fff7e937"} Oct 13 18:16:53 crc 
kubenswrapper[4974]: I1013 18:16:53.055435 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" event={"ID":"b930522f-1b72-4d47-a50e-883ed2c2facb","Type":"ContainerStarted","Data":"4de97af7b63de31304bd955e777c3182ebe6ce2d368c22013a0793c1677647b8"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.056483 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" event={"ID":"18bd98c7-66c4-47fc-b2e4-b53134a191b8","Type":"ContainerStarted","Data":"01e5afdb3b181fb3e75c7796953a8a640b9e4db6270b29d38eb5472e3a7ebadb"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.057812 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ng69k" event={"ID":"78b85293-78d3-47e8-8546-7d625828b45a","Type":"ContainerStarted","Data":"8f1e096cb4620e392549140ea7931f2eae1a04f9429e1e7dbd735f177a5bee96"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.059919 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" event={"ID":"77988f90-8a72-4a03-9394-9f1971c21484","Type":"ContainerStarted","Data":"2d398d18a7ae7789b01a39cfce225cebb33121d3ea4179060e3840f025769227"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.059973 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" event={"ID":"77988f90-8a72-4a03-9394-9f1971c21484","Type":"ContainerStarted","Data":"60fd21d23630c09006e7f03c1a9a0053261f13da51cfeafacaef61563fa67a48"} Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.060796 4974 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wph2s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 
10.217.0.38:8080: connect: connection refused" start-of-body= Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.060838 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" podUID="fa74b416-3f9b-45de-a657-79a31f755b9c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.061232 4974 patch_prober.go:28] interesting pod/downloads-7954f5f757-rr898 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.061364 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rr898" podUID="34b64103-310c-4c28-80ca-decee688ee64" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.111701 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ed10d46-3591-4e9c-bff7-e413bcdc8329\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.111790 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.111940 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ed10d46-3591-4e9c-bff7-e413bcdc8329\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.113293 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"3ed10d46-3591-4e9c-bff7-e413bcdc8329\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.114534 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.61452038 +0000 UTC m=+148.518886460 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.129783 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-5mjnq" podStartSLOduration=125.129747167 podStartE2EDuration="2m5.129747167s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.116904537 +0000 UTC m=+148.021270617" watchObservedRunningTime="2025-10-13 18:16:53.129747167 +0000 UTC m=+148.034113247" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.129948 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" podStartSLOduration=113.129939803 podStartE2EDuration="1m53.129939803s" podCreationTimestamp="2025-10-13 18:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.082341627 +0000 UTC m=+147.986707717" watchObservedRunningTime="2025-10-13 18:16:53.129939803 +0000 UTC m=+148.034305883" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.141939 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"3ed10d46-3591-4e9c-bff7-e413bcdc8329\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.157962 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-kkk6m" podStartSLOduration=126.157943159 podStartE2EDuration="2m6.157943159s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.157410274 +0000 UTC m=+148.061776364" watchObservedRunningTime="2025-10-13 18:16:53.157943159 +0000 UTC m=+148.062309239" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.203995 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.208719 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-tp4tf" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.214066 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.214917 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.714889516 +0000 UTC m=+148.619255656 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.217940 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.219314 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.71929828 +0000 UTC m=+148.623664360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.235846 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" podStartSLOduration=126.235825784 podStartE2EDuration="2m6.235825784s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.196051468 +0000 UTC m=+148.100417558" watchObservedRunningTime="2025-10-13 18:16:53.235825784 +0000 UTC m=+148.140191864" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.236130 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" podStartSLOduration=125.236124512 podStartE2EDuration="2m5.236124512s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.234758844 +0000 UTC m=+148.139124924" watchObservedRunningTime="2025-10-13 18:16:53.236124512 +0000 UTC m=+148.140490582" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.277946 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:16:53 crc 
kubenswrapper[4974]: [-]has-synced failed: reason withheld Oct 13 18:16:53 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:16:53 crc kubenswrapper[4974]: healthz check failed Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.278516 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.303547 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" podStartSLOduration=125.303527363 podStartE2EDuration="2m5.303527363s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.272290317 +0000 UTC m=+148.176656397" watchObservedRunningTime="2025-10-13 18:16:53.303527363 +0000 UTC m=+148.207893443" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.320716 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.321176 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.821158828 +0000 UTC m=+148.725524908 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.347291 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" podStartSLOduration=125.347272291 podStartE2EDuration="2m5.347272291s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.346961872 +0000 UTC m=+148.251327952" watchObservedRunningTime="2025-10-13 18:16:53.347272291 +0000 UTC m=+148.251638371" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.348138 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" podStartSLOduration=125.348132945 podStartE2EDuration="2m5.348132945s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.30590238 +0000 UTC m=+148.210268470" watchObservedRunningTime="2025-10-13 18:16:53.348132945 +0000 UTC m=+148.252499025" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.396415 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-lvmtc" podStartSLOduration=125.396396509 podStartE2EDuration="2m5.396396509s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.394975339 +0000 UTC m=+148.299341419" watchObservedRunningTime="2025-10-13 18:16:53.396396509 +0000 UTC m=+148.300762579" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.423722 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.424053 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:53.924041525 +0000 UTC m=+148.828407605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.436439 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-dwl4j" podStartSLOduration=126.436415682 podStartE2EDuration="2m6.436415682s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.434502779 +0000 UTC m=+148.338868859" watchObservedRunningTime="2025-10-13 18:16:53.436415682 +0000 UTC m=+148.340781762" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.490475 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xfvq4" podStartSLOduration=8.490454659 podStartE2EDuration="8.490454659s" podCreationTimestamp="2025-10-13 18:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.466529037 +0000 UTC m=+148.370895117" watchObservedRunningTime="2025-10-13 18:16:53.490454659 +0000 UTC m=+148.394820739" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.499990 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.506631 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-9mxlt" podStartSLOduration=126.506604162 podStartE2EDuration="2m6.506604162s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.488222146 +0000 UTC m=+148.392588226" watchObservedRunningTime="2025-10-13 18:16:53.506604162 +0000 UTC m=+148.410970242" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.525221 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.525523 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.025506412 +0000 UTC m=+148.929872492 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.530728 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-4r245" podStartSLOduration=125.530702758 podStartE2EDuration="2m5.530702758s" podCreationTimestamp="2025-10-13 18:14:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:53.508137655 +0000 UTC m=+148.412503735" watchObservedRunningTime="2025-10-13 18:16:53.530702758 +0000 UTC m=+148.435068838" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.626793 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.627386 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.12737535 +0000 UTC m=+149.031741430 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.729177 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.729386 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.729430 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.729454 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.729516 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.730554 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.230538125 +0000 UTC m=+149.134904205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.735146 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.739824 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.745365 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.756324 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.830341 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.830712 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.330700546 +0000 UTC m=+149.235066626 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.833624 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.855785 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.863934 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.931750 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.932209 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.432180503 +0000 UTC m=+149.336546583 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.932291 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:53 crc kubenswrapper[4974]: E1013 18:16:53.932602 4974 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.432590904 +0000 UTC m=+149.336956984 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:53 crc kubenswrapper[4974]: I1013 18:16:53.975511 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.033307 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.033476 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.533451365 +0000 UTC m=+149.437817445 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.033768 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.034073 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.534055922 +0000 UTC m=+149.438422002 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.078352 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ed10d46-3591-4e9c-bff7-e413bcdc8329","Type":"ContainerStarted","Data":"263fe7c9dd32908c54c3d5f81fce3921165385bc9e837518fe255d1200cee229"} Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.079769 4974 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wph2s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" start-of-body= Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.079833 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" podUID="fa74b416-3f9b-45de-a657-79a31f755b9c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.38:8080/healthz\": dial tcp 10.217.0.38:8080: connect: connection refused" Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.082024 4974 patch_prober.go:28] interesting pod/downloads-7954f5f757-rr898 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.082072 4974 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-rr898" podUID="34b64103-310c-4c28-80ca-decee688ee64" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.102953 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q7r7s" Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.103210 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-sx8k7" Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.103256 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-9b6fg" Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.105231 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-bhw5m" Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.136782 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.137271 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.637246337 +0000 UTC m=+149.541612417 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.239832 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.242303 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.742289494 +0000 UTC m=+149.646655574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.287292 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:16:54 crc kubenswrapper[4974]: [-]has-synced failed: reason withheld Oct 13 18:16:54 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:16:54 crc kubenswrapper[4974]: healthz check failed Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.287576 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.341025 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.342163 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-10-13 18:16:54.842127676 +0000 UTC m=+149.746493756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.342559 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.342914 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.842906338 +0000 UTC m=+149.747272418 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.444506 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.444837 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:54.944822767 +0000 UTC m=+149.849188847 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.546248 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.546609 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:55.046593003 +0000 UTC m=+149.950959083 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.628056 4974 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.647912 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.648495 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:55.148475822 +0000 UTC m=+150.052841902 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.749493 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.750109 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:55.250084773 +0000 UTC m=+150.154450853 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.851145 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.851442 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-10-13 18:16:55.351404576 +0000 UTC m=+150.255770646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:54 crc kubenswrapper[4974]: I1013 18:16:54.953023 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:54 crc kubenswrapper[4974]: E1013 18:16:54.953407 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-10-13 18:16:55.453391866 +0000 UTC m=+150.357757946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-nlffn" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.049712 4974 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-10-13T18:16:54.628082789Z","Handler":null,"Name":""} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.052471 4974 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.052519 4974 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.053877 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.060876 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.061790 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.063763 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.064153 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.065068 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.071825 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.084800 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cde4fcb3c78ca584eb28571c152341d99223606ffd98a236671081cd66af34c7"} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.084863 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"895341e35d82f6f0a481e2fae8d744d607b4af6e3e69bd04ab6e12ea337e7517"} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.085073 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:16:55 
crc kubenswrapper[4974]: I1013 18:16:55.085991 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"87d6b8f30336a2ffb89bd6af48513b03d44ad4c36911d4be58166cf7fa48153d"} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.086041 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b9734dd78abff39362c88bca350e75095d7ddc31717181130f361b88e3db62cf"} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.087616 4974 generic.go:334] "Generic (PLEG): container finished" podID="3ed10d46-3591-4e9c-bff7-e413bcdc8329" containerID="1541f8083f6dfda6b78b986db0ddf597b42982687659a69ac07f99f7caececbb" exitCode=0 Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.087689 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ed10d46-3591-4e9c-bff7-e413bcdc8329","Type":"ContainerDied","Data":"1541f8083f6dfda6b78b986db0ddf597b42982687659a69ac07f99f7caececbb"} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.090424 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ng69k" event={"ID":"78b85293-78d3-47e8-8546-7d625828b45a","Type":"ContainerStarted","Data":"84ab8112142bdc3b4d6118a183a859aa81c78231fd74ac28b6ac072a8e4370c4"} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.090501 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ng69k" event={"ID":"78b85293-78d3-47e8-8546-7d625828b45a","Type":"ContainerStarted","Data":"13546e871ee6a510eb783032a93a563947a1e084d943dada64baeba61a8050c9"} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.090525 4974 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-ng69k" event={"ID":"78b85293-78d3-47e8-8546-7d625828b45a","Type":"ContainerStarted","Data":"e61dd3f9679a822fb61e9b7c0d6038e930e36b4333e8565daf892b5e15e01a91"} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.092928 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"742196ad59a580c2e87c8d2e887f71327d9626413c4c430e3e1ed97048f4163b"} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.092984 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9b209c60c83f875a9604074549a90d2624a745cb64ab58af4a3cab77fdb688ef"} Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.154916 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.155017 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7572ef96-dbc0-4521-98be-69c2b27d869d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7572ef96-dbc0-4521-98be-69c2b27d869d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.155092 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/7572ef96-dbc0-4521-98be-69c2b27d869d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7572ef96-dbc0-4521-98be-69c2b27d869d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.157391 4974 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.157447 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.185605 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-ng69k" podStartSLOduration=10.185579671 podStartE2EDuration="10.185579671s" podCreationTimestamp="2025-10-13 18:16:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:55.181000363 +0000 UTC m=+150.085366453" watchObservedRunningTime="2025-10-13 18:16:55.185579671 +0000 UTC m=+150.089945751" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.231636 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-nlffn\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.256833 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7572ef96-dbc0-4521-98be-69c2b27d869d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7572ef96-dbc0-4521-98be-69c2b27d869d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.257412 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7572ef96-dbc0-4521-98be-69c2b27d869d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7572ef96-dbc0-4521-98be-69c2b27d869d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.260038 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7572ef96-dbc0-4521-98be-69c2b27d869d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7572ef96-dbc0-4521-98be-69c2b27d869d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.274882 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:16:55 crc kubenswrapper[4974]: [-]has-synced failed: reason withheld Oct 13 18:16:55 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:16:55 crc kubenswrapper[4974]: healthz check failed Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.274974 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.279814 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7572ef96-dbc0-4521-98be-69c2b27d869d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7572ef96-dbc0-4521-98be-69c2b27d869d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.310225 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bhwmv"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.311169 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.315680 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.322870 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhwmv"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.399970 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.461890 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-utilities\") pod \"community-operators-bhwmv\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.461991 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9mms\" (UniqueName: \"kubernetes.io/projected/243c3013-e799-451c-82e3-05371075de32-kube-api-access-d9mms\") pod \"community-operators-bhwmv\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.462029 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-catalog-content\") pod \"community-operators-bhwmv\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.463397 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.515392 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pjz9z"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.516641 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.519377 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.533678 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pjz9z"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.563177 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-catalog-content\") pod \"community-operators-bhwmv\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.563631 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-utilities\") pod \"community-operators-bhwmv\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.563696 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mms\" (UniqueName: \"kubernetes.io/projected/243c3013-e799-451c-82e3-05371075de32-kube-api-access-d9mms\") pod \"community-operators-bhwmv\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.564465 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-catalog-content\") pod \"community-operators-bhwmv\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " 
pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.564523 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-utilities\") pod \"community-operators-bhwmv\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.599565 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9mms\" (UniqueName: \"kubernetes.io/projected/243c3013-e799-451c-82e3-05371075de32-kube-api-access-d9mms\") pod \"community-operators-bhwmv\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.626070 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.664938 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-utilities\") pod \"certified-operators-pjz9z\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.664992 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-catalog-content\") pod \"certified-operators-pjz9z\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.665061 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-7szjc\" (UniqueName: \"kubernetes.io/projected/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-kube-api-access-7szjc\") pod \"certified-operators-pjz9z\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.684936 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Oct 13 18:16:55 crc kubenswrapper[4974]: W1013 18:16:55.690518 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod7572ef96_dbc0_4521_98be_69c2b27d869d.slice/crio-a80707e4d74ebdb6f4803fde4da666fd9fb7283026194f0a586dc192b600ffb0 WatchSource:0}: Error finding container a80707e4d74ebdb6f4803fde4da666fd9fb7283026194f0a586dc192b600ffb0: Status 404 returned error can't find the container with id a80707e4d74ebdb6f4803fde4da666fd9fb7283026194f0a586dc192b600ffb0 Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.715836 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8p68v"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.716749 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.736878 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p68v"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.771041 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7szjc\" (UniqueName: \"kubernetes.io/projected/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-kube-api-access-7szjc\") pod \"certified-operators-pjz9z\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.771095 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-utilities\") pod \"certified-operators-pjz9z\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.771135 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-catalog-content\") pod \"certified-operators-pjz9z\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.771527 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-catalog-content\") pod \"certified-operators-pjz9z\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.772058 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-utilities\") pod \"certified-operators-pjz9z\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.794776 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7szjc\" (UniqueName: \"kubernetes.io/projected/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-kube-api-access-7szjc\") pod \"certified-operators-pjz9z\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.800302 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nlffn"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.824163 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.848581 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.872554 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-catalog-content\") pod \"community-operators-8p68v\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.872622 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-utilities\") pod \"community-operators-8p68v\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.872680 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvfgz\" (UniqueName: \"kubernetes.io/projected/4353b611-898f-42b9-8bfd-927ca6579832-kube-api-access-xvfgz\") pod \"community-operators-8p68v\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.913995 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jv9kj"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.916484 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.920926 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jv9kj"] Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.973445 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-catalog-content\") pod \"community-operators-8p68v\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.973518 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-utilities\") pod \"community-operators-8p68v\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.973563 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvfgz\" (UniqueName: \"kubernetes.io/projected/4353b611-898f-42b9-8bfd-927ca6579832-kube-api-access-xvfgz\") pod \"community-operators-8p68v\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.974050 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-catalog-content\") pod \"community-operators-8p68v\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.974066 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-utilities\") pod \"community-operators-8p68v\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:55 crc kubenswrapper[4974]: I1013 18:16:55.989377 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvfgz\" (UniqueName: \"kubernetes.io/projected/4353b611-898f-42b9-8bfd-927ca6579832-kube-api-access-xvfgz\") pod \"community-operators-8p68v\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.057345 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.075015 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hsqw\" (UniqueName: \"kubernetes.io/projected/31f04939-3c35-4f5d-bea5-34b2c7c27d53-kube-api-access-8hsqw\") pod \"certified-operators-jv9kj\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.075151 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-catalog-content\") pod \"certified-operators-jv9kj\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.075191 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-utilities\") pod \"certified-operators-jv9kj\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") 
" pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.111340 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7572ef96-dbc0-4521-98be-69c2b27d869d","Type":"ContainerStarted","Data":"a80707e4d74ebdb6f4803fde4da666fd9fb7283026194f0a586dc192b600ffb0"} Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.119131 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" event={"ID":"40c4254f-5c3d-4655-82f9-49fd9510339a","Type":"ContainerStarted","Data":"9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4"} Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.119160 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" event={"ID":"40c4254f-5c3d-4655-82f9-49fd9510339a","Type":"ContainerStarted","Data":"4692638061a09b175671fb6061fcdaddee4e6c2ca10502b852f83d6d1b33b67c"} Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.120367 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.130430 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pjz9z"] Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.134206 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bhwmv"] Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.152676 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" podStartSLOduration=129.152650856 podStartE2EDuration="2m9.152650856s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:16:56.150233498 +0000 UTC m=+151.054599578" watchObservedRunningTime="2025-10-13 18:16:56.152650856 +0000 UTC m=+151.057016936" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.175965 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-catalog-content\") pod \"certified-operators-jv9kj\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.176018 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-utilities\") pod \"certified-operators-jv9kj\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.176061 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hsqw\" (UniqueName: \"kubernetes.io/projected/31f04939-3c35-4f5d-bea5-34b2c7c27d53-kube-api-access-8hsqw\") pod \"certified-operators-jv9kj\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.176443 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-catalog-content\") pod \"certified-operators-jv9kj\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.176532 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-utilities\") pod \"certified-operators-jv9kj\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.214030 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hsqw\" (UniqueName: \"kubernetes.io/projected/31f04939-3c35-4f5d-bea5-34b2c7c27d53-kube-api-access-8hsqw\") pod \"certified-operators-jv9kj\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.238431 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.251071 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.251150 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.267583 4974 patch_prober.go:28] interesting pod/console-f9d7485db-jbznq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.267723 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jbznq" podUID="ce0c606d-4062-4f6a-afec-752440b5580c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.275643 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:16:56 crc kubenswrapper[4974]: [-]has-synced failed: reason withheld Oct 13 18:16:56 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:16:56 crc kubenswrapper[4974]: healthz check failed Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.275721 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.381929 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p68v"] Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.382990 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.481311 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kube-api-access\") pod \"3ed10d46-3591-4e9c-bff7-e413bcdc8329\" (UID: \"3ed10d46-3591-4e9c-bff7-e413bcdc8329\") " Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.481365 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kubelet-dir\") pod \"3ed10d46-3591-4e9c-bff7-e413bcdc8329\" (UID: \"3ed10d46-3591-4e9c-bff7-e413bcdc8329\") " Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.481818 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kubelet-dir" 
(OuterVolumeSpecName: "kubelet-dir") pod "3ed10d46-3591-4e9c-bff7-e413bcdc8329" (UID: "3ed10d46-3591-4e9c-bff7-e413bcdc8329"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.502829 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3ed10d46-3591-4e9c-bff7-e413bcdc8329" (UID: "3ed10d46-3591-4e9c-bff7-e413bcdc8329"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.513017 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.526541 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-mlpn9" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.583859 4974 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.583900 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3ed10d46-3591-4e9c-bff7-e413bcdc8329-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 18:16:56 crc kubenswrapper[4974]: I1013 18:16:56.628747 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jv9kj"] Oct 13 18:16:56 crc kubenswrapper[4974]: E1013 18:16:56.895253 4974 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-pod7572ef96_dbc0_4521_98be_69c2b27d869d.slice/crio-15510aecaab77da1bf54f172693d0b96102e486c3901afcc2d4f7cb43e2979ca.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod7572ef96_dbc0_4521_98be_69c2b27d869d.slice/crio-conmon-15510aecaab77da1bf54f172693d0b96102e486c3901afcc2d4f7cb43e2979ca.scope\": RecentStats: unable to find data in memory cache]" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.126740 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"3ed10d46-3591-4e9c-bff7-e413bcdc8329","Type":"ContainerDied","Data":"263fe7c9dd32908c54c3d5f81fce3921165385bc9e837518fe255d1200cee229"} Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.127061 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="263fe7c9dd32908c54c3d5f81fce3921165385bc9e837518fe255d1200cee229" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.127345 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.141164 4974 generic.go:334] "Generic (PLEG): container finished" podID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerID="207572889f0da4448c2ccefc0f4d8f68bf81fe05a38540290031c741c1f88c3a" exitCode=0 Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.141460 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjz9z" event={"ID":"eb659196-3de8-42d4-9fe6-b0c4c5e4de13","Type":"ContainerDied","Data":"207572889f0da4448c2ccefc0f4d8f68bf81fe05a38540290031c741c1f88c3a"} Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.141738 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjz9z" event={"ID":"eb659196-3de8-42d4-9fe6-b0c4c5e4de13","Type":"ContainerStarted","Data":"5d44a53cbfad40a83a03a35b6397149c8ab8fc55f7fb8c6017bbe877498f3d87"} Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.143700 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.144484 4974 generic.go:334] "Generic (PLEG): container finished" podID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerID="5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770" exitCode=0 Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.144584 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv9kj" event={"ID":"31f04939-3c35-4f5d-bea5-34b2c7c27d53","Type":"ContainerDied","Data":"5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770"} Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.144626 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv9kj" 
event={"ID":"31f04939-3c35-4f5d-bea5-34b2c7c27d53","Type":"ContainerStarted","Data":"d1f25bdd8a2250dbeffea83ffa64cd0c0a3b183cf2521b5a3d6945e242acc917"} Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.148547 4974 generic.go:334] "Generic (PLEG): container finished" podID="7572ef96-dbc0-4521-98be-69c2b27d869d" containerID="15510aecaab77da1bf54f172693d0b96102e486c3901afcc2d4f7cb43e2979ca" exitCode=0 Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.148624 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7572ef96-dbc0-4521-98be-69c2b27d869d","Type":"ContainerDied","Data":"15510aecaab77da1bf54f172693d0b96102e486c3901afcc2d4f7cb43e2979ca"} Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.153906 4974 generic.go:334] "Generic (PLEG): container finished" podID="243c3013-e799-451c-82e3-05371075de32" containerID="1a631dee6767222409c0528a36c5545a735c4d472245b638e7e3a8beadb0b349" exitCode=0 Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.153988 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhwmv" event={"ID":"243c3013-e799-451c-82e3-05371075de32","Type":"ContainerDied","Data":"1a631dee6767222409c0528a36c5545a735c4d472245b638e7e3a8beadb0b349"} Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.154020 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhwmv" event={"ID":"243c3013-e799-451c-82e3-05371075de32","Type":"ContainerStarted","Data":"015b3b4ea0bcb911f376d561b6de7dff7c70484d437bf83549ec3600358030aa"} Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.156467 4974 generic.go:334] "Generic (PLEG): container finished" podID="4353b611-898f-42b9-8bfd-927ca6579832" containerID="1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb" exitCode=0 Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.157892 4974 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/community-operators-8p68v" event={"ID":"4353b611-898f-42b9-8bfd-927ca6579832","Type":"ContainerDied","Data":"1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb"} Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.157982 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p68v" event={"ID":"4353b611-898f-42b9-8bfd-927ca6579832","Type":"ContainerStarted","Data":"fdcd261609a62381489edbf73261bc82285e8a956c03e0fc5db3f0d6b5637f38"} Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.192989 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-nrzsg" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.274730 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:16:57 crc kubenswrapper[4974]: [-]has-synced failed: reason withheld Oct 13 18:16:57 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:16:57 crc kubenswrapper[4974]: healthz check failed Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.274798 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.321608 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fp9tn"] Oct 13 18:16:57 crc kubenswrapper[4974]: E1013 18:16:57.321865 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed10d46-3591-4e9c-bff7-e413bcdc8329" containerName="pruner" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 
18:16:57.321878 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed10d46-3591-4e9c-bff7-e413bcdc8329" containerName="pruner" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.321969 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed10d46-3591-4e9c-bff7-e413bcdc8329" containerName="pruner" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.322702 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.325361 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.347499 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp9tn"] Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.399386 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-catalog-content\") pod \"redhat-marketplace-fp9tn\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.399451 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-utilities\") pod \"redhat-marketplace-fp9tn\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.399505 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmtvc\" (UniqueName: \"kubernetes.io/projected/e79aa334-618e-4e55-9113-8627891e2962-kube-api-access-jmtvc\") pod 
\"redhat-marketplace-fp9tn\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.500747 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-catalog-content\") pod \"redhat-marketplace-fp9tn\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.501246 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-utilities\") pod \"redhat-marketplace-fp9tn\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.501305 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmtvc\" (UniqueName: \"kubernetes.io/projected/e79aa334-618e-4e55-9113-8627891e2962-kube-api-access-jmtvc\") pod \"redhat-marketplace-fp9tn\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.501481 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-catalog-content\") pod \"redhat-marketplace-fp9tn\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.501762 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-utilities\") pod \"redhat-marketplace-fp9tn\" (UID: 
\"e79aa334-618e-4e55-9113-8627891e2962\") " pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.530615 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmtvc\" (UniqueName: \"kubernetes.io/projected/e79aa334-618e-4e55-9113-8627891e2962-kube-api-access-jmtvc\") pod \"redhat-marketplace-fp9tn\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.636114 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.731371 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zwj6w"] Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.733293 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.736905 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwj6w"] Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.805446 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-utilities\") pod \"redhat-marketplace-zwj6w\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.805576 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-catalog-content\") pod \"redhat-marketplace-zwj6w\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " 
pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.805634 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcxv4\" (UniqueName: \"kubernetes.io/projected/4b122347-62b2-422f-96d0-7e725a331e1f-kube-api-access-fcxv4\") pod \"redhat-marketplace-zwj6w\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.906789 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcxv4\" (UniqueName: \"kubernetes.io/projected/4b122347-62b2-422f-96d0-7e725a331e1f-kube-api-access-fcxv4\") pod \"redhat-marketplace-zwj6w\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.907369 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-utilities\") pod \"redhat-marketplace-zwj6w\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.907450 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-catalog-content\") pod \"redhat-marketplace-zwj6w\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.908388 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-catalog-content\") pod \"redhat-marketplace-zwj6w\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " 
pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.908598 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-utilities\") pod \"redhat-marketplace-zwj6w\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.932342 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcxv4\" (UniqueName: \"kubernetes.io/projected/4b122347-62b2-422f-96d0-7e725a331e1f-kube-api-access-fcxv4\") pod \"redhat-marketplace-zwj6w\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:57 crc kubenswrapper[4974]: I1013 18:16:57.987849 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp9tn"] Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.062615 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.164895 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp9tn" event={"ID":"e79aa334-618e-4e55-9113-8627891e2962","Type":"ContainerStarted","Data":"9434388f062e3409744a9fbf047f279cb20c127b4b13363cdb79e723412c17a2"} Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.164946 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp9tn" event={"ID":"e79aa334-618e-4e55-9113-8627891e2962","Type":"ContainerStarted","Data":"6230580fad5a03a0442f5dfe3cb4f109b6e8d34c26cf940559252a75db279e3d"} Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.214809 4974 patch_prober.go:28] interesting pod/downloads-7954f5f757-rr898 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.214869 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-rr898" podUID="34b64103-310c-4c28-80ca-decee688ee64" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.214879 4974 patch_prober.go:28] interesting pod/downloads-7954f5f757-rr898 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.214927 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-rr898" podUID="34b64103-310c-4c28-80ca-decee688ee64" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.272049 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.277200 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:16:58 crc kubenswrapper[4974]: [-]has-synced failed: reason withheld Oct 13 18:16:58 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:16:58 crc kubenswrapper[4974]: healthz check failed Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.277266 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.378706 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.441064 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.516290 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7572ef96-dbc0-4521-98be-69c2b27d869d-kubelet-dir\") pod \"7572ef96-dbc0-4521-98be-69c2b27d869d\" (UID: \"7572ef96-dbc0-4521-98be-69c2b27d869d\") " Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.516363 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7572ef96-dbc0-4521-98be-69c2b27d869d-kube-api-access\") pod \"7572ef96-dbc0-4521-98be-69c2b27d869d\" (UID: \"7572ef96-dbc0-4521-98be-69c2b27d869d\") " Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.517141 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7572ef96-dbc0-4521-98be-69c2b27d869d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7572ef96-dbc0-4521-98be-69c2b27d869d" (UID: "7572ef96-dbc0-4521-98be-69c2b27d869d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.521886 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kwtp5"] Oct 13 18:16:58 crc kubenswrapper[4974]: E1013 18:16:58.522169 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7572ef96-dbc0-4521-98be-69c2b27d869d" containerName="pruner" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.522186 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7572ef96-dbc0-4521-98be-69c2b27d869d" containerName="pruner" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.522328 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="7572ef96-dbc0-4521-98be-69c2b27d869d" containerName="pruner" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.523205 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.528964 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7572ef96-dbc0-4521-98be-69c2b27d869d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7572ef96-dbc0-4521-98be-69c2b27d869d" (UID: "7572ef96-dbc0-4521-98be-69c2b27d869d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.529754 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.533207 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwtp5"] Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.583072 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwj6w"] Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.618140 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-catalog-content\") pod \"redhat-operators-kwtp5\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.618204 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-utilities\") pod \"redhat-operators-kwtp5\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.618377 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwczn\" (UniqueName: \"kubernetes.io/projected/8d3b1917-2acd-461e-b659-6056041ee467-kube-api-access-gwczn\") pod \"redhat-operators-kwtp5\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.618443 4974 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/7572ef96-dbc0-4521-98be-69c2b27d869d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.618455 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7572ef96-dbc0-4521-98be-69c2b27d869d-kube-api-access\") on node \"crc\" DevicePath \"\"" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.719177 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwczn\" (UniqueName: \"kubernetes.io/projected/8d3b1917-2acd-461e-b659-6056041ee467-kube-api-access-gwczn\") pod \"redhat-operators-kwtp5\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.719884 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-catalog-content\") pod \"redhat-operators-kwtp5\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.719914 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-utilities\") pod \"redhat-operators-kwtp5\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.720344 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-utilities\") pod \"redhat-operators-kwtp5\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.720608 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-catalog-content\") pod \"redhat-operators-kwtp5\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.762771 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwczn\" (UniqueName: \"kubernetes.io/projected/8d3b1917-2acd-461e-b659-6056041ee467-kube-api-access-gwczn\") pod \"redhat-operators-kwtp5\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.870285 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.920122 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9zxdh"] Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.921172 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:58 crc kubenswrapper[4974]: I1013 18:16:58.931415 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9zxdh"] Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.025974 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-catalog-content\") pod \"redhat-operators-9zxdh\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.026019 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjnxd\" (UniqueName: \"kubernetes.io/projected/d40e6084-5cc0-470e-866b-5c1de19b6875-kube-api-access-jjnxd\") pod \"redhat-operators-9zxdh\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.026263 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-utilities\") pod \"redhat-operators-9zxdh\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.127712 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-utilities\") pod \"redhat-operators-9zxdh\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.128043 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-catalog-content\") pod \"redhat-operators-9zxdh\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.128062 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjnxd\" (UniqueName: \"kubernetes.io/projected/d40e6084-5cc0-470e-866b-5c1de19b6875-kube-api-access-jjnxd\") pod \"redhat-operators-9zxdh\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.128564 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-catalog-content\") pod \"redhat-operators-9zxdh\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.128726 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-utilities\") pod \"redhat-operators-9zxdh\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.146019 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjnxd\" (UniqueName: \"kubernetes.io/projected/d40e6084-5cc0-470e-866b-5c1de19b6875-kube-api-access-jjnxd\") pod \"redhat-operators-9zxdh\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.179867 4974 generic.go:334] "Generic (PLEG): container finished" podID="7429c3e4-2ad8-4373-807a-b69a11868c49" 
containerID="88a68b5de9dd2fe749342e2a2bfa1c3733354311f4ea9f85bd3bcff1fa8c451e" exitCode=0 Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.179958 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" event={"ID":"7429c3e4-2ad8-4373-807a-b69a11868c49","Type":"ContainerDied","Data":"88a68b5de9dd2fe749342e2a2bfa1c3733354311f4ea9f85bd3bcff1fa8c451e"} Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.182274 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7572ef96-dbc0-4521-98be-69c2b27d869d","Type":"ContainerDied","Data":"a80707e4d74ebdb6f4803fde4da666fd9fb7283026194f0a586dc192b600ffb0"} Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.182306 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a80707e4d74ebdb6f4803fde4da666fd9fb7283026194f0a586dc192b600ffb0" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.182382 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.184067 4974 generic.go:334] "Generic (PLEG): container finished" podID="4b122347-62b2-422f-96d0-7e725a331e1f" containerID="f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94" exitCode=0 Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.184133 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwj6w" event={"ID":"4b122347-62b2-422f-96d0-7e725a331e1f","Type":"ContainerDied","Data":"f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94"} Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.184159 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwj6w" event={"ID":"4b122347-62b2-422f-96d0-7e725a331e1f","Type":"ContainerStarted","Data":"adbc19e3a968ff3fc7fce0040de75f2bf8870748f135f9593ba4cea8c89e3edf"} Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.186576 4974 generic.go:334] "Generic (PLEG): container finished" podID="e79aa334-618e-4e55-9113-8627891e2962" containerID="9434388f062e3409744a9fbf047f279cb20c127b4b13363cdb79e723412c17a2" exitCode=0 Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.186618 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp9tn" event={"ID":"e79aa334-618e-4e55-9113-8627891e2962","Type":"ContainerDied","Data":"9434388f062e3409744a9fbf047f279cb20c127b4b13363cdb79e723412c17a2"} Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.255358 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.282572 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:16:59 crc kubenswrapper[4974]: [-]has-synced failed: reason withheld Oct 13 18:16:59 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:16:59 crc kubenswrapper[4974]: healthz check failed Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.282726 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.482387 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kwtp5"] Oct 13 18:16:59 crc kubenswrapper[4974]: I1013 18:16:59.587346 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9zxdh"] Oct 13 18:16:59 crc kubenswrapper[4974]: W1013 18:16:59.626355 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd40e6084_5cc0_470e_866b_5c1de19b6875.slice/crio-e26632ed8a2d16cd00e60f9f55ac0bc93884729c38f9f781183f04d3b1cc26a6 WatchSource:0}: Error finding container e26632ed8a2d16cd00e60f9f55ac0bc93884729c38f9f781183f04d3b1cc26a6: Status 404 returned error can't find the container with id e26632ed8a2d16cd00e60f9f55ac0bc93884729c38f9f781183f04d3b1cc26a6 Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.194655 4974 generic.go:334] "Generic (PLEG): container finished" podID="d40e6084-5cc0-470e-866b-5c1de19b6875" 
containerID="f1b0dacb888c1ed6822872965dc27ab78e54fcbaa3127c6a6ebe2e9025654f23" exitCode=0 Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.194929 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zxdh" event={"ID":"d40e6084-5cc0-470e-866b-5c1de19b6875","Type":"ContainerDied","Data":"f1b0dacb888c1ed6822872965dc27ab78e54fcbaa3127c6a6ebe2e9025654f23"} Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.195010 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zxdh" event={"ID":"d40e6084-5cc0-470e-866b-5c1de19b6875","Type":"ContainerStarted","Data":"e26632ed8a2d16cd00e60f9f55ac0bc93884729c38f9f781183f04d3b1cc26a6"} Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.196553 4974 generic.go:334] "Generic (PLEG): container finished" podID="8d3b1917-2acd-461e-b659-6056041ee467" containerID="f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877" exitCode=0 Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.197077 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwtp5" event={"ID":"8d3b1917-2acd-461e-b659-6056041ee467","Type":"ContainerDied","Data":"f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877"} Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.197145 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwtp5" event={"ID":"8d3b1917-2acd-461e-b659-6056041ee467","Type":"ContainerStarted","Data":"1711b5f4ef9b9f6285e761be4255abd3e08308a63459325ab6da8d00487bf7ba"} Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.272770 4974 patch_prober.go:28] interesting pod/router-default-5444994796-9p2fb container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Oct 13 18:17:00 crc kubenswrapper[4974]: [+]has-synced ok Oct 
13 18:17:00 crc kubenswrapper[4974]: [+]process-running ok Oct 13 18:17:00 crc kubenswrapper[4974]: healthz check failed Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.273130 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9p2fb" podUID="e713c27c-e93f-44ce-8e9a-d908099b699c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.518329 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.543408 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7429c3e4-2ad8-4373-807a-b69a11868c49-config-volume\") pod \"7429c3e4-2ad8-4373-807a-b69a11868c49\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.543469 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7429c3e4-2ad8-4373-807a-b69a11868c49-secret-volume\") pod \"7429c3e4-2ad8-4373-807a-b69a11868c49\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.543504 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knkc8\" (UniqueName: \"kubernetes.io/projected/7429c3e4-2ad8-4373-807a-b69a11868c49-kube-api-access-knkc8\") pod \"7429c3e4-2ad8-4373-807a-b69a11868c49\" (UID: \"7429c3e4-2ad8-4373-807a-b69a11868c49\") " Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.545070 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7429c3e4-2ad8-4373-807a-b69a11868c49-config-volume" (OuterVolumeSpecName: "config-volume") pod 
"7429c3e4-2ad8-4373-807a-b69a11868c49" (UID: "7429c3e4-2ad8-4373-807a-b69a11868c49"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.566783 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7429c3e4-2ad8-4373-807a-b69a11868c49-kube-api-access-knkc8" (OuterVolumeSpecName: "kube-api-access-knkc8") pod "7429c3e4-2ad8-4373-807a-b69a11868c49" (UID: "7429c3e4-2ad8-4373-807a-b69a11868c49"). InnerVolumeSpecName "kube-api-access-knkc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.567215 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7429c3e4-2ad8-4373-807a-b69a11868c49-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7429c3e4-2ad8-4373-807a-b69a11868c49" (UID: "7429c3e4-2ad8-4373-807a-b69a11868c49"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.644974 4974 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7429c3e4-2ad8-4373-807a-b69a11868c49-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.645010 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knkc8\" (UniqueName: \"kubernetes.io/projected/7429c3e4-2ad8-4373-807a-b69a11868c49-kube-api-access-knkc8\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:00 crc kubenswrapper[4974]: I1013 18:17:00.645021 4974 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7429c3e4-2ad8-4373-807a-b69a11868c49-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:01 crc kubenswrapper[4974]: I1013 18:17:01.232879 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" event={"ID":"7429c3e4-2ad8-4373-807a-b69a11868c49","Type":"ContainerDied","Data":"6987a458e60aa4a01a60b7ea12f5880471c2d961c909c4f01d58aa5a4237f997"} Oct 13 18:17:01 crc kubenswrapper[4974]: I1013 18:17:01.232919 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6987a458e60aa4a01a60b7ea12f5880471c2d961c909c4f01d58aa5a4237f997" Oct 13 18:17:01 crc kubenswrapper[4974]: I1013 18:17:01.232952 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh" Oct 13 18:17:01 crc kubenswrapper[4974]: I1013 18:17:01.273419 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:17:01 crc kubenswrapper[4974]: I1013 18:17:01.275589 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9p2fb" Oct 13 18:17:03 crc kubenswrapper[4974]: I1013 18:17:03.411129 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fgm8n" Oct 13 18:17:06 crc kubenswrapper[4974]: I1013 18:17:06.250134 4974 patch_prober.go:28] interesting pod/console-f9d7485db-jbznq container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Oct 13 18:17:06 crc kubenswrapper[4974]: I1013 18:17:06.250703 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-jbznq" podUID="ce0c606d-4062-4f6a-afec-752440b5580c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.8:8443/health\": dial tcp 10.217.0.8:8443: connect: connection refused" Oct 13 18:17:07 crc kubenswrapper[4974]: I1013 18:17:07.743227 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:17:07 crc kubenswrapper[4974]: I1013 18:17:07.743302 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:17:08 crc kubenswrapper[4974]: I1013 18:17:08.220178 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-rr898" Oct 13 18:17:10 crc kubenswrapper[4974]: I1013 18:17:10.024562 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:17:10 crc kubenswrapper[4974]: I1013 18:17:10.036320 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a260247c-2399-42b5-bddc-73e38659680b-metrics-certs\") pod \"network-metrics-daemon-z9hj4\" (UID: \"a260247c-2399-42b5-bddc-73e38659680b\") " pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:17:10 crc kubenswrapper[4974]: I1013 18:17:10.129256 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-z9hj4" Oct 13 18:17:15 crc kubenswrapper[4974]: I1013 18:17:15.470049 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" Oct 13 18:17:16 crc kubenswrapper[4974]: I1013 18:17:16.253597 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:17:16 crc kubenswrapper[4974]: I1013 18:17:16.257287 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:17:22 crc kubenswrapper[4974]: I1013 18:17:22.700955 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-z9hj4"] Oct 13 18:17:22 crc kubenswrapper[4974]: W1013 18:17:22.710150 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda260247c_2399_42b5_bddc_73e38659680b.slice/crio-6d939d777f4ba4dd9a1af2cc13b4d7f32db41bfdef2e0138cc88646a0fe5b945 WatchSource:0}: Error finding container 6d939d777f4ba4dd9a1af2cc13b4d7f32db41bfdef2e0138cc88646a0fe5b945: Status 404 returned error can't find the container with id 6d939d777f4ba4dd9a1af2cc13b4d7f32db41bfdef2e0138cc88646a0fe5b945 Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.422307 4974 generic.go:334] "Generic (PLEG): container finished" podID="4b122347-62b2-422f-96d0-7e725a331e1f" containerID="f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f" exitCode=0 Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.422448 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwj6w" event={"ID":"4b122347-62b2-422f-96d0-7e725a331e1f","Type":"ContainerDied","Data":"f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f"} Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.428442 
4974 generic.go:334] "Generic (PLEG): container finished" podID="d40e6084-5cc0-470e-866b-5c1de19b6875" containerID="32cd5b9ead1faac7df92e66f040da13d0cd7ebb67c79af28b29d03f04404ef7b" exitCode=0 Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.428638 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zxdh" event={"ID":"d40e6084-5cc0-470e-866b-5c1de19b6875","Type":"ContainerDied","Data":"32cd5b9ead1faac7df92e66f040da13d0cd7ebb67c79af28b29d03f04404ef7b"} Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.432798 4974 generic.go:334] "Generic (PLEG): container finished" podID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerID="964ad6675532fb1bdedbcd6e94ce21bfc83beba2951683d15a0772cc8af2386c" exitCode=0 Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.432872 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjz9z" event={"ID":"eb659196-3de8-42d4-9fe6-b0c4c5e4de13","Type":"ContainerDied","Data":"964ad6675532fb1bdedbcd6e94ce21bfc83beba2951683d15a0772cc8af2386c"} Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.436018 4974 generic.go:334] "Generic (PLEG): container finished" podID="243c3013-e799-451c-82e3-05371075de32" containerID="24ca4a485f15c49ba659b0331a0bf64943c398b3f908d7c17875e7e0b7420141" exitCode=0 Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.436090 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhwmv" event={"ID":"243c3013-e799-451c-82e3-05371075de32","Type":"ContainerDied","Data":"24ca4a485f15c49ba659b0331a0bf64943c398b3f908d7c17875e7e0b7420141"} Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.439421 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" event={"ID":"a260247c-2399-42b5-bddc-73e38659680b","Type":"ContainerStarted","Data":"6d939d777f4ba4dd9a1af2cc13b4d7f32db41bfdef2e0138cc88646a0fe5b945"} Oct 13 
18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.444433 4974 generic.go:334] "Generic (PLEG): container finished" podID="e79aa334-618e-4e55-9113-8627891e2962" containerID="1260d317917cea8b344bc87abf5ab731e348c2f6684c30c2b2fbb8eaf54e08ad" exitCode=0 Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.444522 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp9tn" event={"ID":"e79aa334-618e-4e55-9113-8627891e2962","Type":"ContainerDied","Data":"1260d317917cea8b344bc87abf5ab731e348c2f6684c30c2b2fbb8eaf54e08ad"} Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.465486 4974 generic.go:334] "Generic (PLEG): container finished" podID="8d3b1917-2acd-461e-b659-6056041ee467" containerID="7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f" exitCode=0 Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.465620 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwtp5" event={"ID":"8d3b1917-2acd-461e-b659-6056041ee467","Type":"ContainerDied","Data":"7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f"} Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.477489 4974 generic.go:334] "Generic (PLEG): container finished" podID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerID="4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0" exitCode=0 Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.477611 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv9kj" event={"ID":"31f04939-3c35-4f5d-bea5-34b2c7c27d53","Type":"ContainerDied","Data":"4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0"} Oct 13 18:17:23 crc kubenswrapper[4974]: I1013 18:17:23.495231 4974 generic.go:334] "Generic (PLEG): container finished" podID="4353b611-898f-42b9-8bfd-927ca6579832" containerID="1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8" exitCode=0 Oct 13 18:17:23 crc 
kubenswrapper[4974]: I1013 18:17:23.495302 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p68v" event={"ID":"4353b611-898f-42b9-8bfd-927ca6579832","Type":"ContainerDied","Data":"1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8"} Oct 13 18:17:24 crc kubenswrapper[4974]: I1013 18:17:24.512151 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" event={"ID":"a260247c-2399-42b5-bddc-73e38659680b","Type":"ContainerStarted","Data":"337738a369eac7f63453fadabbfa364357501b5146188a20c819813b3b6f0017"} Oct 13 18:17:24 crc kubenswrapper[4974]: I1013 18:17:24.514247 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-z9hj4" event={"ID":"a260247c-2399-42b5-bddc-73e38659680b","Type":"ContainerStarted","Data":"801e6736b8de2993cd760f0677e048ec37a0cba460b4f6353d28735819480a3c"} Oct 13 18:17:24 crc kubenswrapper[4974]: I1013 18:17:24.535041 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-z9hj4" podStartSLOduration=157.535012521 podStartE2EDuration="2m37.535012521s" podCreationTimestamp="2025-10-13 18:14:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:17:24.53107374 +0000 UTC m=+179.435439880" watchObservedRunningTime="2025-10-13 18:17:24.535012521 +0000 UTC m=+179.439378631" Oct 13 18:17:26 crc kubenswrapper[4974]: I1013 18:17:26.540781 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwj6w" event={"ID":"4b122347-62b2-422f-96d0-7e725a331e1f","Type":"ContainerStarted","Data":"baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707"} Oct 13 18:17:26 crc kubenswrapper[4974]: I1013 18:17:26.572940 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-marketplace-zwj6w" podStartSLOduration=3.276905697 podStartE2EDuration="29.572912802s" podCreationTimestamp="2025-10-13 18:16:57 +0000 UTC" firstStartedPulling="2025-10-13 18:16:59.188426416 +0000 UTC m=+154.092792496" lastFinishedPulling="2025-10-13 18:17:25.484433491 +0000 UTC m=+180.388799601" observedRunningTime="2025-10-13 18:17:26.564465485 +0000 UTC m=+181.468831605" watchObservedRunningTime="2025-10-13 18:17:26.572912802 +0000 UTC m=+181.477278922" Oct 13 18:17:27 crc kubenswrapper[4974]: I1013 18:17:27.549983 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zxdh" event={"ID":"d40e6084-5cc0-470e-866b-5c1de19b6875","Type":"ContainerStarted","Data":"e9eec7726ac18defeb6b5eec37a889b2c4aaf8d3b253f1eb4ea08922d55d0b11"} Oct 13 18:17:27 crc kubenswrapper[4974]: I1013 18:17:27.556282 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwtp5" event={"ID":"8d3b1917-2acd-461e-b659-6056041ee467","Type":"ContainerStarted","Data":"37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3"} Oct 13 18:17:27 crc kubenswrapper[4974]: I1013 18:17:27.558414 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv9kj" event={"ID":"31f04939-3c35-4f5d-bea5-34b2c7c27d53","Type":"ContainerStarted","Data":"d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422"} Oct 13 18:17:27 crc kubenswrapper[4974]: I1013 18:17:27.559926 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhwmv" event={"ID":"243c3013-e799-451c-82e3-05371075de32","Type":"ContainerStarted","Data":"5140cbd95fc64ba8f97b8db17379a1f3bb2ecbc251f3481aafeff2924a195fc4"} Oct 13 18:17:27 crc kubenswrapper[4974]: I1013 18:17:27.562236 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p68v" 
event={"ID":"4353b611-898f-42b9-8bfd-927ca6579832","Type":"ContainerStarted","Data":"70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015"} Oct 13 18:17:27 crc kubenswrapper[4974]: I1013 18:17:27.576013 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9zxdh" podStartSLOduration=2.938654075 podStartE2EDuration="29.575994437s" podCreationTimestamp="2025-10-13 18:16:58 +0000 UTC" firstStartedPulling="2025-10-13 18:17:00.197216482 +0000 UTC m=+155.101582562" lastFinishedPulling="2025-10-13 18:17:26.834556814 +0000 UTC m=+181.738922924" observedRunningTime="2025-10-13 18:17:27.568876447 +0000 UTC m=+182.473242537" watchObservedRunningTime="2025-10-13 18:17:27.575994437 +0000 UTC m=+182.480360517" Oct 13 18:17:27 crc kubenswrapper[4974]: I1013 18:17:27.594263 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kwtp5" podStartSLOduration=2.5294228629999997 podStartE2EDuration="29.594245309s" podCreationTimestamp="2025-10-13 18:16:58 +0000 UTC" firstStartedPulling="2025-10-13 18:17:00.202215132 +0000 UTC m=+155.106581202" lastFinishedPulling="2025-10-13 18:17:27.267037568 +0000 UTC m=+182.171403648" observedRunningTime="2025-10-13 18:17:27.593329243 +0000 UTC m=+182.497695343" watchObservedRunningTime="2025-10-13 18:17:27.594245309 +0000 UTC m=+182.498611389" Oct 13 18:17:27 crc kubenswrapper[4974]: I1013 18:17:27.630386 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bhwmv" podStartSLOduration=2.623628 podStartE2EDuration="32.630365312s" podCreationTimestamp="2025-10-13 18:16:55 +0000 UTC" firstStartedPulling="2025-10-13 18:16:57.155528706 +0000 UTC m=+152.059894786" lastFinishedPulling="2025-10-13 18:17:27.162266018 +0000 UTC m=+182.066632098" observedRunningTime="2025-10-13 18:17:27.62635807 +0000 UTC m=+182.530724160" watchObservedRunningTime="2025-10-13 
18:17:27.630365312 +0000 UTC m=+182.534731402" Oct 13 18:17:27 crc kubenswrapper[4974]: I1013 18:17:27.649598 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jv9kj" podStartSLOduration=2.429304918 podStartE2EDuration="32.649579732s" podCreationTimestamp="2025-10-13 18:16:55 +0000 UTC" firstStartedPulling="2025-10-13 18:16:57.147082669 +0000 UTC m=+152.051448749" lastFinishedPulling="2025-10-13 18:17:27.367357473 +0000 UTC m=+182.271723563" observedRunningTime="2025-10-13 18:17:27.646508185 +0000 UTC m=+182.550874265" watchObservedRunningTime="2025-10-13 18:17:27.649579732 +0000 UTC m=+182.553945812" Oct 13 18:17:27 crc kubenswrapper[4974]: I1013 18:17:27.664753 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8p68v" podStartSLOduration=2.673257793 podStartE2EDuration="32.664738287s" podCreationTimestamp="2025-10-13 18:16:55 +0000 UTC" firstStartedPulling="2025-10-13 18:16:57.15888911 +0000 UTC m=+152.063255190" lastFinishedPulling="2025-10-13 18:17:27.150369564 +0000 UTC m=+182.054735684" observedRunningTime="2025-10-13 18:17:27.662575476 +0000 UTC m=+182.566941556" watchObservedRunningTime="2025-10-13 18:17:27.664738287 +0000 UTC m=+182.569104367" Oct 13 18:17:28 crc kubenswrapper[4974]: I1013 18:17:28.063025 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:17:28 crc kubenswrapper[4974]: I1013 18:17:28.063138 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:17:28 crc kubenswrapper[4974]: I1013 18:17:28.567525 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp9tn" 
event={"ID":"e79aa334-618e-4e55-9113-8627891e2962","Type":"ContainerStarted","Data":"7a58db54e098d096814f213452655b31adcf7df89a2887f1c419faffa33d1549"} Oct 13 18:17:28 crc kubenswrapper[4974]: I1013 18:17:28.570139 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjz9z" event={"ID":"eb659196-3de8-42d4-9fe6-b0c4c5e4de13","Type":"ContainerStarted","Data":"d78bac88bc91b2d90ad7a5a190933d35a85dd9de30e2c356f78cc814ac672f06"} Oct 13 18:17:28 crc kubenswrapper[4974]: I1013 18:17:28.587098 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fp9tn" podStartSLOduration=3.311645392 podStartE2EDuration="31.587081657s" podCreationTimestamp="2025-10-13 18:16:57 +0000 UTC" firstStartedPulling="2025-10-13 18:16:59.19534192 +0000 UTC m=+154.099708000" lastFinishedPulling="2025-10-13 18:17:27.470778185 +0000 UTC m=+182.375144265" observedRunningTime="2025-10-13 18:17:28.58397184 +0000 UTC m=+183.488337920" watchObservedRunningTime="2025-10-13 18:17:28.587081657 +0000 UTC m=+183.491447737" Oct 13 18:17:28 crc kubenswrapper[4974]: I1013 18:17:28.607502 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pjz9z" podStartSLOduration=3.271934801 podStartE2EDuration="33.607481449s" podCreationTimestamp="2025-10-13 18:16:55 +0000 UTC" firstStartedPulling="2025-10-13 18:16:57.14319969 +0000 UTC m=+152.047565810" lastFinishedPulling="2025-10-13 18:17:27.478746378 +0000 UTC m=+182.383112458" observedRunningTime="2025-10-13 18:17:28.605087382 +0000 UTC m=+183.509453472" watchObservedRunningTime="2025-10-13 18:17:28.607481449 +0000 UTC m=+183.511847529" Oct 13 18:17:28 crc kubenswrapper[4974]: I1013 18:17:28.652351 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-jtjwq" Oct 13 18:17:28 crc kubenswrapper[4974]: 
I1013 18:17:28.870450 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:17:28 crc kubenswrapper[4974]: I1013 18:17:28.870741 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:17:29 crc kubenswrapper[4974]: I1013 18:17:29.241105 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-zwj6w" podUID="4b122347-62b2-422f-96d0-7e725a331e1f" containerName="registry-server" probeResult="failure" output=< Oct 13 18:17:29 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 18:17:29 crc kubenswrapper[4974]: > Oct 13 18:17:29 crc kubenswrapper[4974]: I1013 18:17:29.256273 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:17:29 crc kubenswrapper[4974]: I1013 18:17:29.256333 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:17:29 crc kubenswrapper[4974]: I1013 18:17:29.906531 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-kwtp5" podUID="8d3b1917-2acd-461e-b659-6056041ee467" containerName="registry-server" probeResult="failure" output=< Oct 13 18:17:29 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 18:17:29 crc kubenswrapper[4974]: > Oct 13 18:17:30 crc kubenswrapper[4974]: I1013 18:17:30.300251 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9zxdh" podUID="d40e6084-5cc0-470e-866b-5c1de19b6875" containerName="registry-server" probeResult="failure" output=< Oct 13 18:17:30 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 18:17:30 crc kubenswrapper[4974]: > Oct 13 18:17:33 crc kubenswrapper[4974]: 
I1013 18:17:33.842747 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Oct 13 18:17:35 crc kubenswrapper[4974]: I1013 18:17:35.627144 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:17:35 crc kubenswrapper[4974]: I1013 18:17:35.627846 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:17:35 crc kubenswrapper[4974]: I1013 18:17:35.705412 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:17:35 crc kubenswrapper[4974]: I1013 18:17:35.849877 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:17:35 crc kubenswrapper[4974]: I1013 18:17:35.849936 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:17:35 crc kubenswrapper[4974]: I1013 18:17:35.914506 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:17:36 crc kubenswrapper[4974]: I1013 18:17:36.060144 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:17:36 crc kubenswrapper[4974]: I1013 18:17:36.060540 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:17:36 crc kubenswrapper[4974]: I1013 18:17:36.104952 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:17:36 crc kubenswrapper[4974]: I1013 18:17:36.239468 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:17:36 crc kubenswrapper[4974]: I1013 18:17:36.239506 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:17:36 crc kubenswrapper[4974]: I1013 18:17:36.278846 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:17:36 crc kubenswrapper[4974]: I1013 18:17:36.677443 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:17:36 crc kubenswrapper[4974]: I1013 18:17:36.689494 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:17:36 crc kubenswrapper[4974]: I1013 18:17:36.689884 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:17:36 crc kubenswrapper[4974]: I1013 18:17:36.692560 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:17:37 crc kubenswrapper[4974]: I1013 18:17:37.552871 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p68v"] Oct 13 18:17:37 crc kubenswrapper[4974]: I1013 18:17:37.636992 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:17:37 crc kubenswrapper[4974]: I1013 18:17:37.637818 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:17:37 crc kubenswrapper[4974]: I1013 18:17:37.694888 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:17:37 crc kubenswrapper[4974]: I1013 
18:17:37.742820 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:17:37 crc kubenswrapper[4974]: I1013 18:17:37.742889 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:17:38 crc kubenswrapper[4974]: I1013 18:17:38.123137 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:17:38 crc kubenswrapper[4974]: I1013 18:17:38.184030 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:17:38 crc kubenswrapper[4974]: I1013 18:17:38.544768 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jv9kj"] Oct 13 18:17:38 crc kubenswrapper[4974]: I1013 18:17:38.627133 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8p68v" podUID="4353b611-898f-42b9-8bfd-927ca6579832" containerName="registry-server" containerID="cri-o://70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015" gracePeriod=2 Oct 13 18:17:38 crc kubenswrapper[4974]: I1013 18:17:38.629169 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jv9kj" podUID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerName="registry-server" containerID="cri-o://d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422" gracePeriod=2 Oct 
13 18:17:38 crc kubenswrapper[4974]: I1013 18:17:38.681959 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:17:38 crc kubenswrapper[4974]: I1013 18:17:38.904585 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:17:38 crc kubenswrapper[4974]: I1013 18:17:38.956378 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.143584 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.190253 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.258413 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-catalog-content\") pod \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.258494 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-utilities\") pod \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\" (UID: \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.258569 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hsqw\" (UniqueName: \"kubernetes.io/projected/31f04939-3c35-4f5d-bea5-34b2c7c27d53-kube-api-access-8hsqw\") pod \"31f04939-3c35-4f5d-bea5-34b2c7c27d53\" (UID: 
\"31f04939-3c35-4f5d-bea5-34b2c7c27d53\") " Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.259475 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-utilities" (OuterVolumeSpecName: "utilities") pod "31f04939-3c35-4f5d-bea5-34b2c7c27d53" (UID: "31f04939-3c35-4f5d-bea5-34b2c7c27d53"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.268716 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31f04939-3c35-4f5d-bea5-34b2c7c27d53-kube-api-access-8hsqw" (OuterVolumeSpecName: "kube-api-access-8hsqw") pod "31f04939-3c35-4f5d-bea5-34b2c7c27d53" (UID: "31f04939-3c35-4f5d-bea5-34b2c7c27d53"). InnerVolumeSpecName "kube-api-access-8hsqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.299326 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31f04939-3c35-4f5d-bea5-34b2c7c27d53" (UID: "31f04939-3c35-4f5d-bea5-34b2c7c27d53"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.307900 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.357120 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.361108 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvfgz\" (UniqueName: \"kubernetes.io/projected/4353b611-898f-42b9-8bfd-927ca6579832-kube-api-access-xvfgz\") pod \"4353b611-898f-42b9-8bfd-927ca6579832\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.361491 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-catalog-content\") pod \"4353b611-898f-42b9-8bfd-927ca6579832\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.361529 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-utilities\") pod \"4353b611-898f-42b9-8bfd-927ca6579832\" (UID: \"4353b611-898f-42b9-8bfd-927ca6579832\") " Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.361701 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hsqw\" (UniqueName: \"kubernetes.io/projected/31f04939-3c35-4f5d-bea5-34b2c7c27d53-kube-api-access-8hsqw\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.361712 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.361722 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31f04939-3c35-4f5d-bea5-34b2c7c27d53-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.362672 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-utilities" (OuterVolumeSpecName: "utilities") pod "4353b611-898f-42b9-8bfd-927ca6579832" (UID: "4353b611-898f-42b9-8bfd-927ca6579832"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.367981 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4353b611-898f-42b9-8bfd-927ca6579832-kube-api-access-xvfgz" (OuterVolumeSpecName: "kube-api-access-xvfgz") pod "4353b611-898f-42b9-8bfd-927ca6579832" (UID: "4353b611-898f-42b9-8bfd-927ca6579832"). InnerVolumeSpecName "kube-api-access-xvfgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.418309 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4353b611-898f-42b9-8bfd-927ca6579832" (UID: "4353b611-898f-42b9-8bfd-927ca6579832"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.462544 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.463041 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4353b611-898f-42b9-8bfd-927ca6579832-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.463114 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvfgz\" (UniqueName: \"kubernetes.io/projected/4353b611-898f-42b9-8bfd-927ca6579832-kube-api-access-xvfgz\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.649116 4974 generic.go:334] "Generic (PLEG): container finished" podID="4353b611-898f-42b9-8bfd-927ca6579832" containerID="70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015" exitCode=0 Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.649171 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p68v" event={"ID":"4353b611-898f-42b9-8bfd-927ca6579832","Type":"ContainerDied","Data":"70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015"} Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.649200 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p68v" event={"ID":"4353b611-898f-42b9-8bfd-927ca6579832","Type":"ContainerDied","Data":"fdcd261609a62381489edbf73261bc82285e8a956c03e0fc5db3f0d6b5637f38"} Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.649217 4974 scope.go:117] "RemoveContainer" containerID="70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 
18:17:39.649327 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p68v" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.657245 4974 generic.go:334] "Generic (PLEG): container finished" podID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerID="d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422" exitCode=0 Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.657313 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jv9kj" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.657308 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv9kj" event={"ID":"31f04939-3c35-4f5d-bea5-34b2c7c27d53","Type":"ContainerDied","Data":"d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422"} Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.657576 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jv9kj" event={"ID":"31f04939-3c35-4f5d-bea5-34b2c7c27d53","Type":"ContainerDied","Data":"d1f25bdd8a2250dbeffea83ffa64cd0c0a3b183cf2521b5a3d6945e242acc917"} Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.682043 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p68v"] Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.684525 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8p68v"] Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.685759 4974 scope.go:117] "RemoveContainer" containerID="1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.695332 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jv9kj"] Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 
18:17:39.700387 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jv9kj"] Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.724286 4974 scope.go:117] "RemoveContainer" containerID="1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.741384 4974 scope.go:117] "RemoveContainer" containerID="70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015" Oct 13 18:17:39 crc kubenswrapper[4974]: E1013 18:17:39.742397 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015\": container with ID starting with 70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015 not found: ID does not exist" containerID="70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.742436 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015"} err="failed to get container status \"70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015\": rpc error: code = NotFound desc = could not find container \"70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015\": container with ID starting with 70d110fc25ac25b5d2bb540ed2e7ea4741e1be86885a2e2fb63a226d9a4a0015 not found: ID does not exist" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.742476 4974 scope.go:117] "RemoveContainer" containerID="1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8" Oct 13 18:17:39 crc kubenswrapper[4974]: E1013 18:17:39.743131 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8\": container with ID 
starting with 1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8 not found: ID does not exist" containerID="1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.743163 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8"} err="failed to get container status \"1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8\": rpc error: code = NotFound desc = could not find container \"1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8\": container with ID starting with 1d940cf7d5e2fb2d372dd214a23dd9beed5c576c541068a7f5f568e6ffe4efb8 not found: ID does not exist" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.743181 4974 scope.go:117] "RemoveContainer" containerID="1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb" Oct 13 18:17:39 crc kubenswrapper[4974]: E1013 18:17:39.743538 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb\": container with ID starting with 1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb not found: ID does not exist" containerID="1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.743672 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb"} err="failed to get container status \"1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb\": rpc error: code = NotFound desc = could not find container \"1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb\": container with ID starting with 1b4b815dbec4de311634a2e0ddd29e37cf861b97b3c233b9be59ca7eb03449cb not found: 
ID does not exist" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.743770 4974 scope.go:117] "RemoveContainer" containerID="d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.776530 4974 scope.go:117] "RemoveContainer" containerID="4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.791442 4974 scope.go:117] "RemoveContainer" containerID="5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.809518 4974 scope.go:117] "RemoveContainer" containerID="d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422" Oct 13 18:17:39 crc kubenswrapper[4974]: E1013 18:17:39.810079 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422\": container with ID starting with d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422 not found: ID does not exist" containerID="d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.810121 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422"} err="failed to get container status \"d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422\": rpc error: code = NotFound desc = could not find container \"d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422\": container with ID starting with d40d38016882c90ca9d9d15196af4c2a9d09859987711aa24a176533ffd33422 not found: ID does not exist" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.810155 4974 scope.go:117] "RemoveContainer" containerID="4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0" Oct 13 18:17:39 crc 
kubenswrapper[4974]: E1013 18:17:39.815154 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0\": container with ID starting with 4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0 not found: ID does not exist" containerID="4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.815285 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0"} err="failed to get container status \"4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0\": rpc error: code = NotFound desc = could not find container \"4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0\": container with ID starting with 4d0dc2902be1b844733157d2ce294b2a2dbc0bbaa3596c86ebd6e0ada7f808f0 not found: ID does not exist" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.815408 4974 scope.go:117] "RemoveContainer" containerID="5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770" Oct 13 18:17:39 crc kubenswrapper[4974]: E1013 18:17:39.815861 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770\": container with ID starting with 5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770 not found: ID does not exist" containerID="5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.815886 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770"} err="failed to get container status 
\"5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770\": rpc error: code = NotFound desc = could not find container \"5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770\": container with ID starting with 5ee5d21eaecddb3ebd4f193eeee9551b9356abfea32c310eeb2e812af5fe5770 not found: ID does not exist" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.819975 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" path="/var/lib/kubelet/pods/31f04939-3c35-4f5d-bea5-34b2c7c27d53/volumes" Oct 13 18:17:39 crc kubenswrapper[4974]: I1013 18:17:39.820754 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4353b611-898f-42b9-8bfd-927ca6579832" path="/var/lib/kubelet/pods/4353b611-898f-42b9-8bfd-927ca6579832/volumes" Oct 13 18:17:40 crc kubenswrapper[4974]: I1013 18:17:40.948184 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwj6w"] Oct 13 18:17:40 crc kubenswrapper[4974]: I1013 18:17:40.949032 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-zwj6w" podUID="4b122347-62b2-422f-96d0-7e725a331e1f" containerName="registry-server" containerID="cri-o://baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707" gracePeriod=2 Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.359856 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.386232 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-catalog-content\") pod \"4b122347-62b2-422f-96d0-7e725a331e1f\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.386389 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-utilities\") pod \"4b122347-62b2-422f-96d0-7e725a331e1f\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.386421 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcxv4\" (UniqueName: \"kubernetes.io/projected/4b122347-62b2-422f-96d0-7e725a331e1f-kube-api-access-fcxv4\") pod \"4b122347-62b2-422f-96d0-7e725a331e1f\" (UID: \"4b122347-62b2-422f-96d0-7e725a331e1f\") " Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.387593 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-utilities" (OuterVolumeSpecName: "utilities") pod "4b122347-62b2-422f-96d0-7e725a331e1f" (UID: "4b122347-62b2-422f-96d0-7e725a331e1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.392912 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b122347-62b2-422f-96d0-7e725a331e1f-kube-api-access-fcxv4" (OuterVolumeSpecName: "kube-api-access-fcxv4") pod "4b122347-62b2-422f-96d0-7e725a331e1f" (UID: "4b122347-62b2-422f-96d0-7e725a331e1f"). InnerVolumeSpecName "kube-api-access-fcxv4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.414696 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4b122347-62b2-422f-96d0-7e725a331e1f" (UID: "4b122347-62b2-422f-96d0-7e725a331e1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.487367 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.487445 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcxv4\" (UniqueName: \"kubernetes.io/projected/4b122347-62b2-422f-96d0-7e725a331e1f-kube-api-access-fcxv4\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.487468 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4b122347-62b2-422f-96d0-7e725a331e1f-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.671859 4974 generic.go:334] "Generic (PLEG): container finished" podID="4b122347-62b2-422f-96d0-7e725a331e1f" containerID="baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707" exitCode=0 Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.671900 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zwj6w" event={"ID":"4b122347-62b2-422f-96d0-7e725a331e1f","Type":"ContainerDied","Data":"baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707"} Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.671932 4974 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-zwj6w" event={"ID":"4b122347-62b2-422f-96d0-7e725a331e1f","Type":"ContainerDied","Data":"adbc19e3a968ff3fc7fce0040de75f2bf8870748f135f9593ba4cea8c89e3edf"} Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.671936 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zwj6w" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.671952 4974 scope.go:117] "RemoveContainer" containerID="baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.689949 4974 scope.go:117] "RemoveContainer" containerID="f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.698422 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwj6w"] Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.700771 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-zwj6w"] Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.710808 4974 scope.go:117] "RemoveContainer" containerID="f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.726206 4974 scope.go:117] "RemoveContainer" containerID="baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707" Oct 13 18:17:41 crc kubenswrapper[4974]: E1013 18:17:41.726507 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707\": container with ID starting with baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707 not found: ID does not exist" containerID="baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.726537 4974 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707"} err="failed to get container status \"baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707\": rpc error: code = NotFound desc = could not find container \"baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707\": container with ID starting with baf343583c77c9dd8a8f71b778a456892b9bb5e6b70c54a909ddd2b432e8d707 not found: ID does not exist" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.726560 4974 scope.go:117] "RemoveContainer" containerID="f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f" Oct 13 18:17:41 crc kubenswrapper[4974]: E1013 18:17:41.726821 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f\": container with ID starting with f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f not found: ID does not exist" containerID="f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.726848 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f"} err="failed to get container status \"f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f\": rpc error: code = NotFound desc = could not find container \"f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f\": container with ID starting with f674a806e053ee3233c1e7401807d78b1dbf28069e3eb293d2dd40f0b602041f not found: ID does not exist" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.726862 4974 scope.go:117] "RemoveContainer" containerID="f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94" Oct 13 18:17:41 crc kubenswrapper[4974]: E1013 
18:17:41.727083 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94\": container with ID starting with f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94 not found: ID does not exist" containerID="f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.727103 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94"} err="failed to get container status \"f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94\": rpc error: code = NotFound desc = could not find container \"f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94\": container with ID starting with f90df1b0c9bcea92c397cb468ed8a9d7457af15500766c6d0b3878acb81a6f94 not found: ID does not exist" Oct 13 18:17:41 crc kubenswrapper[4974]: I1013 18:17:41.817907 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b122347-62b2-422f-96d0-7e725a331e1f" path="/var/lib/kubelet/pods/4b122347-62b2-422f-96d0-7e725a331e1f/volumes" Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.343677 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9zxdh"] Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.344571 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9zxdh" podUID="d40e6084-5cc0-470e-866b-5c1de19b6875" containerName="registry-server" containerID="cri-o://e9eec7726ac18defeb6b5eec37a889b2c4aaf8d3b253f1eb4ea08922d55d0b11" gracePeriod=2 Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.687089 4974 generic.go:334] "Generic (PLEG): container finished" podID="d40e6084-5cc0-470e-866b-5c1de19b6875" 
containerID="e9eec7726ac18defeb6b5eec37a889b2c4aaf8d3b253f1eb4ea08922d55d0b11" exitCode=0 Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.687135 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9zxdh" event={"ID":"d40e6084-5cc0-470e-866b-5c1de19b6875","Type":"ContainerDied","Data":"e9eec7726ac18defeb6b5eec37a889b2c4aaf8d3b253f1eb4ea08922d55d0b11"} Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.777358 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.913263 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-utilities\") pod \"d40e6084-5cc0-470e-866b-5c1de19b6875\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.913345 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-catalog-content\") pod \"d40e6084-5cc0-470e-866b-5c1de19b6875\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.913419 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjnxd\" (UniqueName: \"kubernetes.io/projected/d40e6084-5cc0-470e-866b-5c1de19b6875-kube-api-access-jjnxd\") pod \"d40e6084-5cc0-470e-866b-5c1de19b6875\" (UID: \"d40e6084-5cc0-470e-866b-5c1de19b6875\") " Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.914132 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-utilities" (OuterVolumeSpecName: "utilities") pod "d40e6084-5cc0-470e-866b-5c1de19b6875" (UID: 
"d40e6084-5cc0-470e-866b-5c1de19b6875"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.926808 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d40e6084-5cc0-470e-866b-5c1de19b6875-kube-api-access-jjnxd" (OuterVolumeSpecName: "kube-api-access-jjnxd") pod "d40e6084-5cc0-470e-866b-5c1de19b6875" (UID: "d40e6084-5cc0-470e-866b-5c1de19b6875"). InnerVolumeSpecName "kube-api-access-jjnxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:17:43 crc kubenswrapper[4974]: I1013 18:17:43.991359 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d40e6084-5cc0-470e-866b-5c1de19b6875" (UID: "d40e6084-5cc0-470e-866b-5c1de19b6875"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:17:44 crc kubenswrapper[4974]: I1013 18:17:44.015123 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjnxd\" (UniqueName: \"kubernetes.io/projected/d40e6084-5cc0-470e-866b-5c1de19b6875-kube-api-access-jjnxd\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:44 crc kubenswrapper[4974]: I1013 18:17:44.015173 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:44 crc kubenswrapper[4974]: I1013 18:17:44.015192 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d40e6084-5cc0-470e-866b-5c1de19b6875-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:17:44 crc kubenswrapper[4974]: I1013 18:17:44.695295 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9zxdh" event={"ID":"d40e6084-5cc0-470e-866b-5c1de19b6875","Type":"ContainerDied","Data":"e26632ed8a2d16cd00e60f9f55ac0bc93884729c38f9f781183f04d3b1cc26a6"} Oct 13 18:17:44 crc kubenswrapper[4974]: I1013 18:17:44.695357 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9zxdh" Oct 13 18:17:44 crc kubenswrapper[4974]: I1013 18:17:44.695365 4974 scope.go:117] "RemoveContainer" containerID="e9eec7726ac18defeb6b5eec37a889b2c4aaf8d3b253f1eb4ea08922d55d0b11" Oct 13 18:17:44 crc kubenswrapper[4974]: I1013 18:17:44.714374 4974 scope.go:117] "RemoveContainer" containerID="32cd5b9ead1faac7df92e66f040da13d0cd7ebb67c79af28b29d03f04404ef7b" Oct 13 18:17:44 crc kubenswrapper[4974]: I1013 18:17:44.728118 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9zxdh"] Oct 13 18:17:44 crc kubenswrapper[4974]: I1013 18:17:44.735070 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9zxdh"] Oct 13 18:17:44 crc kubenswrapper[4974]: I1013 18:17:44.746563 4974 scope.go:117] "RemoveContainer" containerID="f1b0dacb888c1ed6822872965dc27ab78e54fcbaa3127c6a6ebe2e9025654f23" Oct 13 18:17:45 crc kubenswrapper[4974]: I1013 18:17:45.817570 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d40e6084-5cc0-470e-866b-5c1de19b6875" path="/var/lib/kubelet/pods/d40e6084-5cc0-470e-866b-5c1de19b6875/volumes" Oct 13 18:18:07 crc kubenswrapper[4974]: I1013 18:18:07.742453 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:18:07 crc kubenswrapper[4974]: I1013 18:18:07.742814 4974 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:18:07 crc kubenswrapper[4974]: I1013 18:18:07.742861 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:18:07 crc kubenswrapper[4974]: I1013 18:18:07.743352 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 18:18:07 crc kubenswrapper[4974]: I1013 18:18:07.743397 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3" gracePeriod=600 Oct 13 18:18:08 crc kubenswrapper[4974]: I1013 18:18:08.827502 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3" exitCode=0 Oct 13 18:18:08 crc kubenswrapper[4974]: I1013 18:18:08.827953 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3"} Oct 13 18:18:08 crc kubenswrapper[4974]: I1013 18:18:08.828280 4974 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"29dc800fef44c866f1bfd3f3e803022d871dfc0b9f20d8a971dfc51b409fc8b0"} Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.664615 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pjz9z"] Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.665841 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pjz9z" podUID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerName="registry-server" containerID="cri-o://d78bac88bc91b2d90ad7a5a190933d35a85dd9de30e2c356f78cc814ac672f06" gracePeriod=30 Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.673298 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bhwmv"] Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.673724 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bhwmv" podUID="243c3013-e799-451c-82e3-05371075de32" containerName="registry-server" containerID="cri-o://5140cbd95fc64ba8f97b8db17379a1f3bb2ecbc251f3481aafeff2924a195fc4" gracePeriod=30 Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.680993 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wph2s"] Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.681211 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" podUID="fa74b416-3f9b-45de-a657-79a31f755b9c" containerName="marketplace-operator" containerID="cri-o://3dfec715d330631610c39ecf853902834fe263a2b2c4c604c76da49c6ddd88ae" gracePeriod=30 Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.695942 4974 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp9tn"] Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.696362 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fp9tn" podUID="e79aa334-618e-4e55-9113-8627891e2962" containerName="registry-server" containerID="cri-o://7a58db54e098d096814f213452655b31adcf7df89a2887f1c419faffa33d1549" gracePeriod=30 Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.713822 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jvrtp"] Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714334 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7429c3e4-2ad8-4373-807a-b69a11868c49" containerName="collect-profiles" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714357 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7429c3e4-2ad8-4373-807a-b69a11868c49" containerName="collect-profiles" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714373 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714391 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714409 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4353b611-898f-42b9-8bfd-927ca6579832" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714418 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4353b611-898f-42b9-8bfd-927ca6579832" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714441 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4353b611-898f-42b9-8bfd-927ca6579832" 
containerName="extract-content" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714456 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4353b611-898f-42b9-8bfd-927ca6579832" containerName="extract-content" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714478 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b122347-62b2-422f-96d0-7e725a331e1f" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714487 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b122347-62b2-422f-96d0-7e725a331e1f" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714504 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40e6084-5cc0-470e-866b-5c1de19b6875" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714526 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40e6084-5cc0-470e-866b-5c1de19b6875" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714591 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerName="extract-utilities" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714601 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerName="extract-utilities" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714621 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b122347-62b2-422f-96d0-7e725a331e1f" containerName="extract-utilities" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714630 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b122347-62b2-422f-96d0-7e725a331e1f" containerName="extract-utilities" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714641 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4353b611-898f-42b9-8bfd-927ca6579832" 
containerName="extract-utilities" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714650 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4353b611-898f-42b9-8bfd-927ca6579832" containerName="extract-utilities" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714680 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerName="extract-content" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714692 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerName="extract-content" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714703 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40e6084-5cc0-470e-866b-5c1de19b6875" containerName="extract-content" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714714 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40e6084-5cc0-470e-866b-5c1de19b6875" containerName="extract-content" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714735 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b122347-62b2-422f-96d0-7e725a331e1f" containerName="extract-content" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714747 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b122347-62b2-422f-96d0-7e725a331e1f" containerName="extract-content" Oct 13 18:18:09 crc kubenswrapper[4974]: E1013 18:18:09.714759 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d40e6084-5cc0-470e-866b-5c1de19b6875" containerName="extract-utilities" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.714768 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d40e6084-5cc0-470e-866b-5c1de19b6875" containerName="extract-utilities" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.715048 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d40e6084-5cc0-470e-866b-5c1de19b6875" 
containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.715076 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f04939-3c35-4f5d-bea5-34b2c7c27d53" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.715097 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="7429c3e4-2ad8-4373-807a-b69a11868c49" containerName="collect-profiles" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.715117 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b122347-62b2-422f-96d0-7e725a331e1f" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.715132 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="4353b611-898f-42b9-8bfd-927ca6579832" containerName="registry-server" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.716221 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.773279 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwtp5"] Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.773542 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kwtp5" podUID="8d3b1917-2acd-461e-b659-6056041ee467" containerName="registry-server" containerID="cri-o://37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3" gracePeriod=30 Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.784083 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jvrtp"] Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.835374 4974 generic.go:334] "Generic (PLEG): container finished" podID="fa74b416-3f9b-45de-a657-79a31f755b9c" 
containerID="3dfec715d330631610c39ecf853902834fe263a2b2c4c604c76da49c6ddd88ae" exitCode=0 Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.835425 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" event={"ID":"fa74b416-3f9b-45de-a657-79a31f755b9c","Type":"ContainerDied","Data":"3dfec715d330631610c39ecf853902834fe263a2b2c4c604c76da49c6ddd88ae"} Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.841317 4974 generic.go:334] "Generic (PLEG): container finished" podID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerID="d78bac88bc91b2d90ad7a5a190933d35a85dd9de30e2c356f78cc814ac672f06" exitCode=0 Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.841394 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjz9z" event={"ID":"eb659196-3de8-42d4-9fe6-b0c4c5e4de13","Type":"ContainerDied","Data":"d78bac88bc91b2d90ad7a5a190933d35a85dd9de30e2c356f78cc814ac672f06"} Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.847888 4974 generic.go:334] "Generic (PLEG): container finished" podID="243c3013-e799-451c-82e3-05371075de32" containerID="5140cbd95fc64ba8f97b8db17379a1f3bb2ecbc251f3481aafeff2924a195fc4" exitCode=0 Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.847969 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhwmv" event={"ID":"243c3013-e799-451c-82e3-05371075de32","Type":"ContainerDied","Data":"5140cbd95fc64ba8f97b8db17379a1f3bb2ecbc251f3481aafeff2924a195fc4"} Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.851548 4974 generic.go:334] "Generic (PLEG): container finished" podID="e79aa334-618e-4e55-9113-8627891e2962" containerID="7a58db54e098d096814f213452655b31adcf7df89a2887f1c419faffa33d1549" exitCode=0 Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.851594 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-fp9tn" event={"ID":"e79aa334-618e-4e55-9113-8627891e2962","Type":"ContainerDied","Data":"7a58db54e098d096814f213452655b31adcf7df89a2887f1c419faffa33d1549"} Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.867671 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz6bx\" (UniqueName: \"kubernetes.io/projected/09e0a416-6821-4853-8c22-d5e55e540657-kube-api-access-vz6bx\") pod \"marketplace-operator-79b997595-jvrtp\" (UID: \"09e0a416-6821-4853-8c22-d5e55e540657\") " pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.867767 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09e0a416-6821-4853-8c22-d5e55e540657-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jvrtp\" (UID: \"09e0a416-6821-4853-8c22-d5e55e540657\") " pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.867812 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/09e0a416-6821-4853-8c22-d5e55e540657-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jvrtp\" (UID: \"09e0a416-6821-4853-8c22-d5e55e540657\") " pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.968835 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09e0a416-6821-4853-8c22-d5e55e540657-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jvrtp\" (UID: \"09e0a416-6821-4853-8c22-d5e55e540657\") " pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" 
Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.968899 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/09e0a416-6821-4853-8c22-d5e55e540657-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jvrtp\" (UID: \"09e0a416-6821-4853-8c22-d5e55e540657\") " pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.968938 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz6bx\" (UniqueName: \"kubernetes.io/projected/09e0a416-6821-4853-8c22-d5e55e540657-kube-api-access-vz6bx\") pod \"marketplace-operator-79b997595-jvrtp\" (UID: \"09e0a416-6821-4853-8c22-d5e55e540657\") " pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.970796 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/09e0a416-6821-4853-8c22-d5e55e540657-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jvrtp\" (UID: \"09e0a416-6821-4853-8c22-d5e55e540657\") " pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.981153 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/09e0a416-6821-4853-8c22-d5e55e540657-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jvrtp\" (UID: \"09e0a416-6821-4853-8c22-d5e55e540657\") " pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:09 crc kubenswrapper[4974]: I1013 18:18:09.988860 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz6bx\" (UniqueName: \"kubernetes.io/projected/09e0a416-6821-4853-8c22-d5e55e540657-kube-api-access-vz6bx\") pod 
\"marketplace-operator-79b997595-jvrtp\" (UID: \"09e0a416-6821-4853-8c22-d5e55e540657\") " pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.116838 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.182985 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.184302 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.211508 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.213727 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.249608 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373075 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-catalog-content\") pod \"e79aa334-618e-4e55-9113-8627891e2962\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373118 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-utilities\") pod \"243c3013-e799-451c-82e3-05371075de32\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373137 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-catalog-content\") pod \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373179 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7szjc\" (UniqueName: \"kubernetes.io/projected/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-kube-api-access-7szjc\") pod \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373227 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-catalog-content\") pod \"8d3b1917-2acd-461e-b659-6056041ee467\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373248 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-operator-metrics\") pod \"fa74b416-3f9b-45de-a657-79a31f755b9c\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373742 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-utilities\") pod \"8d3b1917-2acd-461e-b659-6056041ee467\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373772 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-utilities\") pod \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\" (UID: \"eb659196-3de8-42d4-9fe6-b0c4c5e4de13\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373799 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2qqc\" (UniqueName: \"kubernetes.io/projected/fa74b416-3f9b-45de-a657-79a31f755b9c-kube-api-access-g2qqc\") pod \"fa74b416-3f9b-45de-a657-79a31f755b9c\" (UID: \"fa74b416-3f9b-45de-a657-79a31f755b9c\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373818 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmtvc\" (UniqueName: \"kubernetes.io/projected/e79aa334-618e-4e55-9113-8627891e2962-kube-api-access-jmtvc\") pod \"e79aa334-618e-4e55-9113-8627891e2962\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373852 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-trusted-ca\") pod \"fa74b416-3f9b-45de-a657-79a31f755b9c\" (UID: 
\"fa74b416-3f9b-45de-a657-79a31f755b9c\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373887 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9mms\" (UniqueName: \"kubernetes.io/projected/243c3013-e799-451c-82e3-05371075de32-kube-api-access-d9mms\") pod \"243c3013-e799-451c-82e3-05371075de32\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373914 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-catalog-content\") pod \"243c3013-e799-451c-82e3-05371075de32\" (UID: \"243c3013-e799-451c-82e3-05371075de32\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373945 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwczn\" (UniqueName: \"kubernetes.io/projected/8d3b1917-2acd-461e-b659-6056041ee467-kube-api-access-gwczn\") pod \"8d3b1917-2acd-461e-b659-6056041ee467\" (UID: \"8d3b1917-2acd-461e-b659-6056041ee467\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.373963 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-utilities\") pod \"e79aa334-618e-4e55-9113-8627891e2962\" (UID: \"e79aa334-618e-4e55-9113-8627891e2962\") " Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.374962 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-utilities" (OuterVolumeSpecName: "utilities") pod "8d3b1917-2acd-461e-b659-6056041ee467" (UID: "8d3b1917-2acd-461e-b659-6056041ee467"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.375729 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-utilities" (OuterVolumeSpecName: "utilities") pod "eb659196-3de8-42d4-9fe6-b0c4c5e4de13" (UID: "eb659196-3de8-42d4-9fe6-b0c4c5e4de13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.375945 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "fa74b416-3f9b-45de-a657-79a31f755b9c" (UID: "fa74b416-3f9b-45de-a657-79a31f755b9c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.375314 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-utilities" (OuterVolumeSpecName: "utilities") pod "243c3013-e799-451c-82e3-05371075de32" (UID: "243c3013-e799-451c-82e3-05371075de32"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.378141 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e79aa334-618e-4e55-9113-8627891e2962-kube-api-access-jmtvc" (OuterVolumeSpecName: "kube-api-access-jmtvc") pod "e79aa334-618e-4e55-9113-8627891e2962" (UID: "e79aa334-618e-4e55-9113-8627891e2962"). InnerVolumeSpecName "kube-api-access-jmtvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.378308 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa74b416-3f9b-45de-a657-79a31f755b9c-kube-api-access-g2qqc" (OuterVolumeSpecName: "kube-api-access-g2qqc") pod "fa74b416-3f9b-45de-a657-79a31f755b9c" (UID: "fa74b416-3f9b-45de-a657-79a31f755b9c"). InnerVolumeSpecName "kube-api-access-g2qqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.378582 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/243c3013-e799-451c-82e3-05371075de32-kube-api-access-d9mms" (OuterVolumeSpecName: "kube-api-access-d9mms") pod "243c3013-e799-451c-82e3-05371075de32" (UID: "243c3013-e799-451c-82e3-05371075de32"). InnerVolumeSpecName "kube-api-access-d9mms". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.378638 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-kube-api-access-7szjc" (OuterVolumeSpecName: "kube-api-access-7szjc") pod "eb659196-3de8-42d4-9fe6-b0c4c5e4de13" (UID: "eb659196-3de8-42d4-9fe6-b0c4c5e4de13"). InnerVolumeSpecName "kube-api-access-7szjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.378970 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d3b1917-2acd-461e-b659-6056041ee467-kube-api-access-gwczn" (OuterVolumeSpecName: "kube-api-access-gwczn") pod "8d3b1917-2acd-461e-b659-6056041ee467" (UID: "8d3b1917-2acd-461e-b659-6056041ee467"). InnerVolumeSpecName "kube-api-access-gwczn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.379226 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "fa74b416-3f9b-45de-a657-79a31f755b9c" (UID: "fa74b416-3f9b-45de-a657-79a31f755b9c"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.388355 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-utilities" (OuterVolumeSpecName: "utilities") pod "e79aa334-618e-4e55-9113-8627891e2962" (UID: "e79aa334-618e-4e55-9113-8627891e2962"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.411842 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e79aa334-618e-4e55-9113-8627891e2962" (UID: "e79aa334-618e-4e55-9113-8627891e2962"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.424975 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb659196-3de8-42d4-9fe6-b0c4c5e4de13" (UID: "eb659196-3de8-42d4-9fe6-b0c4c5e4de13"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.435990 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "243c3013-e799-451c-82e3-05371075de32" (UID: "243c3013-e799-451c-82e3-05371075de32"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475563 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwczn\" (UniqueName: \"kubernetes.io/projected/8d3b1917-2acd-461e-b659-6056041ee467-kube-api-access-gwczn\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475604 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475617 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e79aa334-618e-4e55-9113-8627891e2962-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475629 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475641 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475671 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7szjc\" (UniqueName: 
\"kubernetes.io/projected/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-kube-api-access-7szjc\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475683 4974 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475697 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475707 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb659196-3de8-42d4-9fe6-b0c4c5e4de13-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475717 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2qqc\" (UniqueName: \"kubernetes.io/projected/fa74b416-3f9b-45de-a657-79a31f755b9c-kube-api-access-g2qqc\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475728 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmtvc\" (UniqueName: \"kubernetes.io/projected/e79aa334-618e-4e55-9113-8627891e2962-kube-api-access-jmtvc\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475740 4974 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fa74b416-3f9b-45de-a657-79a31f755b9c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475751 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9mms\" (UniqueName: 
\"kubernetes.io/projected/243c3013-e799-451c-82e3-05371075de32-kube-api-access-d9mms\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.475761 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/243c3013-e799-451c-82e3-05371075de32-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.499391 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8d3b1917-2acd-461e-b659-6056041ee467" (UID: "8d3b1917-2acd-461e-b659-6056041ee467"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.539609 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jvrtp"] Oct 13 18:18:10 crc kubenswrapper[4974]: W1013 18:18:10.547821 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09e0a416_6821_4853_8c22_d5e55e540657.slice/crio-c22f114411b7f9e69503295018dd61fc9301c1b6eb0d1afc151465a30b22e6c7 WatchSource:0}: Error finding container c22f114411b7f9e69503295018dd61fc9301c1b6eb0d1afc151465a30b22e6c7: Status 404 returned error can't find the container with id c22f114411b7f9e69503295018dd61fc9301c1b6eb0d1afc151465a30b22e6c7 Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.577115 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8d3b1917-2acd-461e-b659-6056041ee467-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.858072 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" event={"ID":"09e0a416-6821-4853-8c22-d5e55e540657","Type":"ContainerStarted","Data":"41327a4acee4f411045a392ebfc9242d35dccfa8e9880c36c03f0b576aa23bc8"} Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.858499 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.858517 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" event={"ID":"09e0a416-6821-4853-8c22-d5e55e540657","Type":"ContainerStarted","Data":"c22f114411b7f9e69503295018dd61fc9301c1b6eb0d1afc151465a30b22e6c7"} Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.859383 4974 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jvrtp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.54:8080/healthz\": dial tcp 10.217.0.54:8080: connect: connection refused" start-of-body= Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.859433 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" podUID="09e0a416-6821-4853-8c22-d5e55e540657" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.54:8080/healthz\": dial tcp 10.217.0.54:8080: connect: connection refused" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.860074 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bhwmv" event={"ID":"243c3013-e799-451c-82e3-05371075de32","Type":"ContainerDied","Data":"015b3b4ea0bcb911f376d561b6de7dff7c70484d437bf83549ec3600358030aa"} Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.860105 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bhwmv" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.860126 4974 scope.go:117] "RemoveContainer" containerID="5140cbd95fc64ba8f97b8db17379a1f3bb2ecbc251f3481aafeff2924a195fc4" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.862428 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fp9tn" event={"ID":"e79aa334-618e-4e55-9113-8627891e2962","Type":"ContainerDied","Data":"6230580fad5a03a0442f5dfe3cb4f109b6e8d34c26cf940559252a75db279e3d"} Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.862541 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fp9tn" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.867972 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.868026 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wph2s" event={"ID":"fa74b416-3f9b-45de-a657-79a31f755b9c","Type":"ContainerDied","Data":"470318fda49410b6d5543b35484039d375b46cf4d5dcccb891aaa0ae29748da7"} Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.870401 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pjz9z" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.870403 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pjz9z" event={"ID":"eb659196-3de8-42d4-9fe6-b0c4c5e4de13","Type":"ContainerDied","Data":"5d44a53cbfad40a83a03a35b6397149c8ab8fc55f7fb8c6017bbe877498f3d87"} Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.876831 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" podStartSLOduration=1.876809167 podStartE2EDuration="1.876809167s" podCreationTimestamp="2025-10-13 18:18:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:18:10.874468701 +0000 UTC m=+225.778834781" watchObservedRunningTime="2025-10-13 18:18:10.876809167 +0000 UTC m=+225.781175247" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.879204 4974 generic.go:334] "Generic (PLEG): container finished" podID="8d3b1917-2acd-461e-b659-6056041ee467" containerID="37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3" exitCode=0 Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.879258 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwtp5" event={"ID":"8d3b1917-2acd-461e-b659-6056041ee467","Type":"ContainerDied","Data":"37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3"} Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.879291 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kwtp5" event={"ID":"8d3b1917-2acd-461e-b659-6056041ee467","Type":"ContainerDied","Data":"1711b5f4ef9b9f6285e761be4255abd3e08308a63459325ab6da8d00487bf7ba"} Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.879368 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kwtp5" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.899026 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gvsws"] Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.902404 4974 scope.go:117] "RemoveContainer" containerID="24ca4a485f15c49ba659b0331a0bf64943c398b3f908d7c17875e7e0b7420141" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.933434 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp9tn"] Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.952431 4974 scope.go:117] "RemoveContainer" containerID="1a631dee6767222409c0528a36c5545a735c4d472245b638e7e3a8beadb0b349" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.956894 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fp9tn"] Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.974529 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bhwmv"] Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.976791 4974 scope.go:117] "RemoveContainer" containerID="7a58db54e098d096814f213452655b31adcf7df89a2887f1c419faffa33d1549" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.980340 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bhwmv"] Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.992839 4974 scope.go:117] "RemoveContainer" containerID="1260d317917cea8b344bc87abf5ab731e348c2f6684c30c2b2fbb8eaf54e08ad" Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.993254 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wph2s"] Oct 13 18:18:10 crc kubenswrapper[4974]: I1013 18:18:10.998338 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/marketplace-operator-79b997595-wph2s"] Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.002739 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pjz9z"] Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.005213 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pjz9z"] Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.011023 4974 scope.go:117] "RemoveContainer" containerID="9434388f062e3409744a9fbf047f279cb20c127b4b13363cdb79e723412c17a2" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.017331 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kwtp5"] Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.021528 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kwtp5"] Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.037138 4974 scope.go:117] "RemoveContainer" containerID="3dfec715d330631610c39ecf853902834fe263a2b2c4c604c76da49c6ddd88ae" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.054860 4974 scope.go:117] "RemoveContainer" containerID="d78bac88bc91b2d90ad7a5a190933d35a85dd9de30e2c356f78cc814ac672f06" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.084068 4974 scope.go:117] "RemoveContainer" containerID="964ad6675532fb1bdedbcd6e94ce21bfc83beba2951683d15a0772cc8af2386c" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.099917 4974 scope.go:117] "RemoveContainer" containerID="207572889f0da4448c2ccefc0f4d8f68bf81fe05a38540290031c741c1f88c3a" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.116288 4974 scope.go:117] "RemoveContainer" containerID="37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.129481 4974 scope.go:117] "RemoveContainer" 
containerID="7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.145698 4974 scope.go:117] "RemoveContainer" containerID="f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.159731 4974 scope.go:117] "RemoveContainer" containerID="37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.160102 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3\": container with ID starting with 37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3 not found: ID does not exist" containerID="37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.160130 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3"} err="failed to get container status \"37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3\": rpc error: code = NotFound desc = could not find container \"37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3\": container with ID starting with 37fabf9bf1466661291fd29cfef553753460b80e170cf607c81ecde0b5d9e0a3 not found: ID does not exist" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.160155 4974 scope.go:117] "RemoveContainer" containerID="7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.160577 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f\": container with ID starting with 
7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f not found: ID does not exist" containerID="7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.160627 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f"} err="failed to get container status \"7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f\": rpc error: code = NotFound desc = could not find container \"7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f\": container with ID starting with 7301841bbe06ac48e8e4130f0e264dc91cd5d7acb3b3884fe1cbf2e651258a8f not found: ID does not exist" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.160674 4974 scope.go:117] "RemoveContainer" containerID="f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.161147 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877\": container with ID starting with f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877 not found: ID does not exist" containerID="f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.161173 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877"} err="failed to get container status \"f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877\": rpc error: code = NotFound desc = could not find container \"f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877\": container with ID starting with f5ea7f2c01ba63264105c9aca3b4731e72bded8016906c6af26a019ddfe25877 not found: ID does not 
exist" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.477522 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6jsrw"] Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.477930 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e79aa334-618e-4e55-9113-8627891e2962" containerName="extract-content" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.478004 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e79aa334-618e-4e55-9113-8627891e2962" containerName="extract-content" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.478068 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerName="extract-utilities" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.478121 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerName="extract-utilities" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.478176 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerName="registry-server" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.478233 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerName="registry-server" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.478286 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d3b1917-2acd-461e-b659-6056041ee467" containerName="registry-server" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.478339 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3b1917-2acd-461e-b659-6056041ee467" containerName="registry-server" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.478396 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243c3013-e799-451c-82e3-05371075de32" containerName="extract-content" Oct 13 18:18:11 crc 
kubenswrapper[4974]: I1013 18:18:11.478447 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="243c3013-e799-451c-82e3-05371075de32" containerName="extract-content" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.478498 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d3b1917-2acd-461e-b659-6056041ee467" containerName="extract-utilities" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.478553 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3b1917-2acd-461e-b659-6056041ee467" containerName="extract-utilities" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.478607 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa74b416-3f9b-45de-a657-79a31f755b9c" containerName="marketplace-operator" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.478677 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa74b416-3f9b-45de-a657-79a31f755b9c" containerName="marketplace-operator" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.478748 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243c3013-e799-451c-82e3-05371075de32" containerName="extract-utilities" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.478800 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="243c3013-e799-451c-82e3-05371075de32" containerName="extract-utilities" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.478854 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerName="extract-content" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.478905 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerName="extract-content" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.478968 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e79aa334-618e-4e55-9113-8627891e2962" containerName="registry-server" Oct 13 18:18:11 
crc kubenswrapper[4974]: I1013 18:18:11.479035 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e79aa334-618e-4e55-9113-8627891e2962" containerName="registry-server" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.479092 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="243c3013-e799-451c-82e3-05371075de32" containerName="registry-server" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.479143 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="243c3013-e799-451c-82e3-05371075de32" containerName="registry-server" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.479196 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e79aa334-618e-4e55-9113-8627891e2962" containerName="extract-utilities" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.479246 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e79aa334-618e-4e55-9113-8627891e2962" containerName="extract-utilities" Oct 13 18:18:11 crc kubenswrapper[4974]: E1013 18:18:11.479298 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d3b1917-2acd-461e-b659-6056041ee467" containerName="extract-content" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.479348 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d3b1917-2acd-461e-b659-6056041ee467" containerName="extract-content" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.479501 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="243c3013-e799-451c-82e3-05371075de32" containerName="registry-server" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.479574 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" containerName="registry-server" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.479632 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="e79aa334-618e-4e55-9113-8627891e2962" containerName="registry-server" Oct 13 18:18:11 crc 
kubenswrapper[4974]: I1013 18:18:11.479777 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d3b1917-2acd-461e-b659-6056041ee467" containerName="registry-server" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.479872 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa74b416-3f9b-45de-a657-79a31f755b9c" containerName="marketplace-operator" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.480591 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.482798 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.486020 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jsrw"] Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.594417 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spx7m\" (UniqueName: \"kubernetes.io/projected/b9b28cad-3016-495e-b2cb-33b07dcc4d2d-kube-api-access-spx7m\") pod \"redhat-marketplace-6jsrw\" (UID: \"b9b28cad-3016-495e-b2cb-33b07dcc4d2d\") " pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.594755 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9b28cad-3016-495e-b2cb-33b07dcc4d2d-utilities\") pod \"redhat-marketplace-6jsrw\" (UID: \"b9b28cad-3016-495e-b2cb-33b07dcc4d2d\") " pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.594899 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b9b28cad-3016-495e-b2cb-33b07dcc4d2d-catalog-content\") pod \"redhat-marketplace-6jsrw\" (UID: \"b9b28cad-3016-495e-b2cb-33b07dcc4d2d\") " pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.696319 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spx7m\" (UniqueName: \"kubernetes.io/projected/b9b28cad-3016-495e-b2cb-33b07dcc4d2d-kube-api-access-spx7m\") pod \"redhat-marketplace-6jsrw\" (UID: \"b9b28cad-3016-495e-b2cb-33b07dcc4d2d\") " pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.697103 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9b28cad-3016-495e-b2cb-33b07dcc4d2d-utilities\") pod \"redhat-marketplace-6jsrw\" (UID: \"b9b28cad-3016-495e-b2cb-33b07dcc4d2d\") " pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.697579 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b9b28cad-3016-495e-b2cb-33b07dcc4d2d-catalog-content\") pod \"redhat-marketplace-6jsrw\" (UID: \"b9b28cad-3016-495e-b2cb-33b07dcc4d2d\") " pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.697759 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b9b28cad-3016-495e-b2cb-33b07dcc4d2d-utilities\") pod \"redhat-marketplace-6jsrw\" (UID: \"b9b28cad-3016-495e-b2cb-33b07dcc4d2d\") " pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.698031 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b9b28cad-3016-495e-b2cb-33b07dcc4d2d-catalog-content\") pod \"redhat-marketplace-6jsrw\" (UID: \"b9b28cad-3016-495e-b2cb-33b07dcc4d2d\") " pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.720670 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spx7m\" (UniqueName: \"kubernetes.io/projected/b9b28cad-3016-495e-b2cb-33b07dcc4d2d-kube-api-access-spx7m\") pod \"redhat-marketplace-6jsrw\" (UID: \"b9b28cad-3016-495e-b2cb-33b07dcc4d2d\") " pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.796857 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.817952 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="243c3013-e799-451c-82e3-05371075de32" path="/var/lib/kubelet/pods/243c3013-e799-451c-82e3-05371075de32/volumes" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.819533 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d3b1917-2acd-461e-b659-6056041ee467" path="/var/lib/kubelet/pods/8d3b1917-2acd-461e-b659-6056041ee467/volumes" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.820813 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e79aa334-618e-4e55-9113-8627891e2962" path="/var/lib/kubelet/pods/e79aa334-618e-4e55-9113-8627891e2962/volumes" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.823561 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb659196-3de8-42d4-9fe6-b0c4c5e4de13" path="/var/lib/kubelet/pods/eb659196-3de8-42d4-9fe6-b0c4c5e4de13/volumes" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.825836 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa74b416-3f9b-45de-a657-79a31f755b9c" 
path="/var/lib/kubelet/pods/fa74b416-3f9b-45de-a657-79a31f755b9c/volumes" Oct 13 18:18:11 crc kubenswrapper[4974]: I1013 18:18:11.908920 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jvrtp" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.255351 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6jsrw"] Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.481399 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-26g4k"] Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.484056 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-26g4k"] Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.484211 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.485734 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.642180 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-catalog-content\") pod \"certified-operators-26g4k\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.642228 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m87qj\" (UniqueName: \"kubernetes.io/projected/850191fb-a1bf-43b6-910c-cf2a1da233f5-kube-api-access-m87qj\") pod \"certified-operators-26g4k\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " 
pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.642283 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-utilities\") pod \"certified-operators-26g4k\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.743919 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-catalog-content\") pod \"certified-operators-26g4k\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.743981 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m87qj\" (UniqueName: \"kubernetes.io/projected/850191fb-a1bf-43b6-910c-cf2a1da233f5-kube-api-access-m87qj\") pod \"certified-operators-26g4k\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.744046 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-utilities\") pod \"certified-operators-26g4k\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.744455 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-utilities\") pod \"certified-operators-26g4k\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " 
pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.744530 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-catalog-content\") pod \"certified-operators-26g4k\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.769580 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m87qj\" (UniqueName: \"kubernetes.io/projected/850191fb-a1bf-43b6-910c-cf2a1da233f5-kube-api-access-m87qj\") pod \"certified-operators-26g4k\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.803732 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.920279 4974 generic.go:334] "Generic (PLEG): container finished" podID="b9b28cad-3016-495e-b2cb-33b07dcc4d2d" containerID="fecb6b11ab8b939b9eb445196c39a4ed782239f17cebb14de192e92538d88cd2" exitCode=0 Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.920500 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jsrw" event={"ID":"b9b28cad-3016-495e-b2cb-33b07dcc4d2d","Type":"ContainerDied","Data":"fecb6b11ab8b939b9eb445196c39a4ed782239f17cebb14de192e92538d88cd2"} Oct 13 18:18:12 crc kubenswrapper[4974]: I1013 18:18:12.920730 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jsrw" event={"ID":"b9b28cad-3016-495e-b2cb-33b07dcc4d2d","Type":"ContainerStarted","Data":"416dc2eb123ed8edfcf71ce34d7c778c382ba26b9b162fa1fb38e9b65a8dc2d7"} Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 18:18:13.105451 
4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-26g4k"] Oct 13 18:18:13 crc kubenswrapper[4974]: W1013 18:18:13.117448 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod850191fb_a1bf_43b6_910c_cf2a1da233f5.slice/crio-0aa8f3325d0475410e826cb5be709669268d8d1933b7532fdeff34fefd66cd8c WatchSource:0}: Error finding container 0aa8f3325d0475410e826cb5be709669268d8d1933b7532fdeff34fefd66cd8c: Status 404 returned error can't find the container with id 0aa8f3325d0475410e826cb5be709669268d8d1933b7532fdeff34fefd66cd8c Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 18:18:13.880094 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8dgp2"] Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 18:18:13.881187 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 18:18:13.883816 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 18:18:13.889099 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8dgp2"] Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 18:18:13.927604 4974 generic.go:334] "Generic (PLEG): container finished" podID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerID="095b4b702996fc4958f47e5cfbbae5da390f3b83e341cf4428004a04eb3827bd" exitCode=0 Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 18:18:13.927646 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26g4k" event={"ID":"850191fb-a1bf-43b6-910c-cf2a1da233f5","Type":"ContainerDied","Data":"095b4b702996fc4958f47e5cfbbae5da390f3b83e341cf4428004a04eb3827bd"} Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 
18:18:13.927688 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26g4k" event={"ID":"850191fb-a1bf-43b6-910c-cf2a1da233f5","Type":"ContainerStarted","Data":"0aa8f3325d0475410e826cb5be709669268d8d1933b7532fdeff34fefd66cd8c"} Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 18:18:13.969557 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45t7h\" (UniqueName: \"kubernetes.io/projected/2d8eecf7-a8ba-4155-bad2-1931a61fd0e8-kube-api-access-45t7h\") pod \"redhat-operators-8dgp2\" (UID: \"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8\") " pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 18:18:13.969618 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8eecf7-a8ba-4155-bad2-1931a61fd0e8-catalog-content\") pod \"redhat-operators-8dgp2\" (UID: \"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8\") " pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:13 crc kubenswrapper[4974]: I1013 18:18:13.969687 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8eecf7-a8ba-4155-bad2-1931a61fd0e8-utilities\") pod \"redhat-operators-8dgp2\" (UID: \"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8\") " pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.070793 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45t7h\" (UniqueName: \"kubernetes.io/projected/2d8eecf7-a8ba-4155-bad2-1931a61fd0e8-kube-api-access-45t7h\") pod \"redhat-operators-8dgp2\" (UID: \"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8\") " pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.070863 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8eecf7-a8ba-4155-bad2-1931a61fd0e8-catalog-content\") pod \"redhat-operators-8dgp2\" (UID: \"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8\") " pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.070927 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8eecf7-a8ba-4155-bad2-1931a61fd0e8-utilities\") pod \"redhat-operators-8dgp2\" (UID: \"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8\") " pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.071763 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2d8eecf7-a8ba-4155-bad2-1931a61fd0e8-utilities\") pod \"redhat-operators-8dgp2\" (UID: \"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8\") " pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.071775 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2d8eecf7-a8ba-4155-bad2-1931a61fd0e8-catalog-content\") pod \"redhat-operators-8dgp2\" (UID: \"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8\") " pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.091296 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45t7h\" (UniqueName: \"kubernetes.io/projected/2d8eecf7-a8ba-4155-bad2-1931a61fd0e8-kube-api-access-45t7h\") pod \"redhat-operators-8dgp2\" (UID: \"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8\") " pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.198590 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.439241 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8dgp2"] Oct 13 18:18:14 crc kubenswrapper[4974]: W1013 18:18:14.444833 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8eecf7_a8ba_4155_bad2_1931a61fd0e8.slice/crio-3816a7c1798f283284abc36f112da9eb2fc28d2a255e11c7b4d27fd921c3b042 WatchSource:0}: Error finding container 3816a7c1798f283284abc36f112da9eb2fc28d2a255e11c7b4d27fd921c3b042: Status 404 returned error can't find the container with id 3816a7c1798f283284abc36f112da9eb2fc28d2a255e11c7b4d27fd921c3b042 Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.886619 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-llxfj"] Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.887727 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.890236 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.895787 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llxfj"] Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.934149 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26g4k" event={"ID":"850191fb-a1bf-43b6-910c-cf2a1da233f5","Type":"ContainerStarted","Data":"b053562002e9ac192eec734232bc07504d2c0702ac82c0fa5f1343730471ae17"} Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.936179 4974 generic.go:334] "Generic (PLEG): container finished" podID="2d8eecf7-a8ba-4155-bad2-1931a61fd0e8" containerID="cdb4f563f8c16aeda0fb760a813984a9b9e971f5cbf34065cbc1ce6af1299705" exitCode=0 Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.936247 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dgp2" event={"ID":"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8","Type":"ContainerDied","Data":"cdb4f563f8c16aeda0fb760a813984a9b9e971f5cbf34065cbc1ce6af1299705"} Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.936315 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dgp2" event={"ID":"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8","Type":"ContainerStarted","Data":"3816a7c1798f283284abc36f112da9eb2fc28d2a255e11c7b4d27fd921c3b042"} Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.938973 4974 generic.go:334] "Generic (PLEG): container finished" podID="b9b28cad-3016-495e-b2cb-33b07dcc4d2d" containerID="356a790e9f792a99656165bf29f0b396b91a594ec5313f0262ce32d7ecc7f174" exitCode=0 Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.939003 4974 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jsrw" event={"ID":"b9b28cad-3016-495e-b2cb-33b07dcc4d2d","Type":"ContainerDied","Data":"356a790e9f792a99656165bf29f0b396b91a594ec5313f0262ce32d7ecc7f174"} Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.985169 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbcbc\" (UniqueName: \"kubernetes.io/projected/b21f08a9-cf8a-4598-8b0a-f43015102fc6-kube-api-access-pbcbc\") pod \"community-operators-llxfj\" (UID: \"b21f08a9-cf8a-4598-8b0a-f43015102fc6\") " pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.985247 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b21f08a9-cf8a-4598-8b0a-f43015102fc6-catalog-content\") pod \"community-operators-llxfj\" (UID: \"b21f08a9-cf8a-4598-8b0a-f43015102fc6\") " pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:14 crc kubenswrapper[4974]: I1013 18:18:14.985264 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b21f08a9-cf8a-4598-8b0a-f43015102fc6-utilities\") pod \"community-operators-llxfj\" (UID: \"b21f08a9-cf8a-4598-8b0a-f43015102fc6\") " pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.086486 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b21f08a9-cf8a-4598-8b0a-f43015102fc6-catalog-content\") pod \"community-operators-llxfj\" (UID: \"b21f08a9-cf8a-4598-8b0a-f43015102fc6\") " pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.087398 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b21f08a9-cf8a-4598-8b0a-f43015102fc6-utilities\") pod \"community-operators-llxfj\" (UID: \"b21f08a9-cf8a-4598-8b0a-f43015102fc6\") " pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.087574 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbcbc\" (UniqueName: \"kubernetes.io/projected/b21f08a9-cf8a-4598-8b0a-f43015102fc6-kube-api-access-pbcbc\") pod \"community-operators-llxfj\" (UID: \"b21f08a9-cf8a-4598-8b0a-f43015102fc6\") " pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.088857 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b21f08a9-cf8a-4598-8b0a-f43015102fc6-catalog-content\") pod \"community-operators-llxfj\" (UID: \"b21f08a9-cf8a-4598-8b0a-f43015102fc6\") " pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.089334 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b21f08a9-cf8a-4598-8b0a-f43015102fc6-utilities\") pod \"community-operators-llxfj\" (UID: \"b21f08a9-cf8a-4598-8b0a-f43015102fc6\") " pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.108778 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbcbc\" (UniqueName: \"kubernetes.io/projected/b21f08a9-cf8a-4598-8b0a-f43015102fc6-kube-api-access-pbcbc\") pod \"community-operators-llxfj\" (UID: \"b21f08a9-cf8a-4598-8b0a-f43015102fc6\") " pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.210539 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.653410 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-llxfj"] Oct 13 18:18:15 crc kubenswrapper[4974]: W1013 18:18:15.659521 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb21f08a9_cf8a_4598_8b0a_f43015102fc6.slice/crio-0595a4d7effd6dbceaaf02cf20adc55437c949f882d19be0411ed53e09398582 WatchSource:0}: Error finding container 0595a4d7effd6dbceaaf02cf20adc55437c949f882d19be0411ed53e09398582: Status 404 returned error can't find the container with id 0595a4d7effd6dbceaaf02cf20adc55437c949f882d19be0411ed53e09398582 Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.945562 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6jsrw" event={"ID":"b9b28cad-3016-495e-b2cb-33b07dcc4d2d","Type":"ContainerStarted","Data":"f30325c969be350c77639e4d57e4508ac1ce3f00259296dea23d17b14a13fa75"} Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.947023 4974 generic.go:334] "Generic (PLEG): container finished" podID="b21f08a9-cf8a-4598-8b0a-f43015102fc6" containerID="43d74b8f1cb1911a1bb000e39a45f51fd655080f1366a55cb5583fa745f4ef71" exitCode=0 Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.947067 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llxfj" event={"ID":"b21f08a9-cf8a-4598-8b0a-f43015102fc6","Type":"ContainerDied","Data":"43d74b8f1cb1911a1bb000e39a45f51fd655080f1366a55cb5583fa745f4ef71"} Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.947086 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llxfj" 
event={"ID":"b21f08a9-cf8a-4598-8b0a-f43015102fc6","Type":"ContainerStarted","Data":"0595a4d7effd6dbceaaf02cf20adc55437c949f882d19be0411ed53e09398582"} Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.952204 4974 generic.go:334] "Generic (PLEG): container finished" podID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerID="b053562002e9ac192eec734232bc07504d2c0702ac82c0fa5f1343730471ae17" exitCode=0 Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.952268 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26g4k" event={"ID":"850191fb-a1bf-43b6-910c-cf2a1da233f5","Type":"ContainerDied","Data":"b053562002e9ac192eec734232bc07504d2c0702ac82c0fa5f1343730471ae17"} Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.955230 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dgp2" event={"ID":"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8","Type":"ContainerStarted","Data":"950b61aa680dce60cf275471cbb1868743edf063b15e35730fccb6c0f0bc9909"} Oct 13 18:18:15 crc kubenswrapper[4974]: I1013 18:18:15.966285 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6jsrw" podStartSLOduration=2.486510848 podStartE2EDuration="4.966266185s" podCreationTimestamp="2025-10-13 18:18:11 +0000 UTC" firstStartedPulling="2025-10-13 18:18:12.933848477 +0000 UTC m=+227.838214637" lastFinishedPulling="2025-10-13 18:18:15.413603894 +0000 UTC m=+230.317969974" observedRunningTime="2025-10-13 18:18:15.963843697 +0000 UTC m=+230.868209777" watchObservedRunningTime="2025-10-13 18:18:15.966266185 +0000 UTC m=+230.870632275" Oct 13 18:18:16 crc kubenswrapper[4974]: I1013 18:18:16.974286 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llxfj" 
event={"ID":"b21f08a9-cf8a-4598-8b0a-f43015102fc6","Type":"ContainerStarted","Data":"528e3542081841d770ccbec866633c4841d2a710f53a3aef79695a8dc2d49b40"} Oct 13 18:18:16 crc kubenswrapper[4974]: I1013 18:18:16.977921 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26g4k" event={"ID":"850191fb-a1bf-43b6-910c-cf2a1da233f5","Type":"ContainerStarted","Data":"ec21deab25ff91f6a322455846260e426a25c4ef7abbf90e8b1fbffef7aaa324"} Oct 13 18:18:16 crc kubenswrapper[4974]: I1013 18:18:16.979560 4974 generic.go:334] "Generic (PLEG): container finished" podID="2d8eecf7-a8ba-4155-bad2-1931a61fd0e8" containerID="950b61aa680dce60cf275471cbb1868743edf063b15e35730fccb6c0f0bc9909" exitCode=0 Oct 13 18:18:16 crc kubenswrapper[4974]: I1013 18:18:16.979603 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dgp2" event={"ID":"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8","Type":"ContainerDied","Data":"950b61aa680dce60cf275471cbb1868743edf063b15e35730fccb6c0f0bc9909"} Oct 13 18:18:17 crc kubenswrapper[4974]: I1013 18:18:17.038781 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-26g4k" podStartSLOduration=2.445545559 podStartE2EDuration="5.038761949s" podCreationTimestamp="2025-10-13 18:18:12 +0000 UTC" firstStartedPulling="2025-10-13 18:18:13.940352305 +0000 UTC m=+228.844718375" lastFinishedPulling="2025-10-13 18:18:16.533568695 +0000 UTC m=+231.437934765" observedRunningTime="2025-10-13 18:18:17.011737174 +0000 UTC m=+231.916103254" watchObservedRunningTime="2025-10-13 18:18:17.038761949 +0000 UTC m=+231.943128029" Oct 13 18:18:17 crc kubenswrapper[4974]: I1013 18:18:17.986566 4974 generic.go:334] "Generic (PLEG): container finished" podID="b21f08a9-cf8a-4598-8b0a-f43015102fc6" containerID="528e3542081841d770ccbec866633c4841d2a710f53a3aef79695a8dc2d49b40" exitCode=0 Oct 13 18:18:17 crc kubenswrapper[4974]: I1013 18:18:17.986842 
4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llxfj" event={"ID":"b21f08a9-cf8a-4598-8b0a-f43015102fc6","Type":"ContainerDied","Data":"528e3542081841d770ccbec866633c4841d2a710f53a3aef79695a8dc2d49b40"} Oct 13 18:18:18 crc kubenswrapper[4974]: I1013 18:18:18.994548 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-llxfj" event={"ID":"b21f08a9-cf8a-4598-8b0a-f43015102fc6","Type":"ContainerStarted","Data":"1d4eabda4f15dd3a74a09bfe56b98c6b7d66aeb8b42ea65fd2ad141d2a59cce1"} Oct 13 18:18:18 crc kubenswrapper[4974]: I1013 18:18:18.996627 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8dgp2" event={"ID":"2d8eecf7-a8ba-4155-bad2-1931a61fd0e8","Type":"ContainerStarted","Data":"ddbb4d24726c4386bdf37779e972982c3f102135b20605b89d693b31d35e5708"} Oct 13 18:18:19 crc kubenswrapper[4974]: I1013 18:18:19.014806 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-llxfj" podStartSLOduration=2.461858671 podStartE2EDuration="5.014790424s" podCreationTimestamp="2025-10-13 18:18:14 +0000 UTC" firstStartedPulling="2025-10-13 18:18:15.947979164 +0000 UTC m=+230.852345244" lastFinishedPulling="2025-10-13 18:18:18.500910927 +0000 UTC m=+233.405276997" observedRunningTime="2025-10-13 18:18:19.011278195 +0000 UTC m=+233.915644295" watchObservedRunningTime="2025-10-13 18:18:19.014790424 +0000 UTC m=+233.919156494" Oct 13 18:18:19 crc kubenswrapper[4974]: I1013 18:18:19.036698 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8dgp2" podStartSLOduration=3.448587369 podStartE2EDuration="6.036672605s" podCreationTimestamp="2025-10-13 18:18:13 +0000 UTC" firstStartedPulling="2025-10-13 18:18:14.937629497 +0000 UTC m=+229.841995587" lastFinishedPulling="2025-10-13 18:18:17.525714722 +0000 UTC m=+232.430080823" 
observedRunningTime="2025-10-13 18:18:19.033106246 +0000 UTC m=+233.937472336" watchObservedRunningTime="2025-10-13 18:18:19.036672605 +0000 UTC m=+233.941038695" Oct 13 18:18:21 crc kubenswrapper[4974]: I1013 18:18:21.797073 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:21 crc kubenswrapper[4974]: I1013 18:18:21.797394 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:21 crc kubenswrapper[4974]: I1013 18:18:21.865435 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:22 crc kubenswrapper[4974]: I1013 18:18:22.059290 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6jsrw" Oct 13 18:18:22 crc kubenswrapper[4974]: I1013 18:18:22.805001 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:22 crc kubenswrapper[4974]: I1013 18:18:22.805092 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:22 crc kubenswrapper[4974]: I1013 18:18:22.858243 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:23 crc kubenswrapper[4974]: I1013 18:18:23.078113 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-26g4k" Oct 13 18:18:24 crc kubenswrapper[4974]: I1013 18:18:24.201422 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:24 crc kubenswrapper[4974]: I1013 18:18:24.202007 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:24 crc kubenswrapper[4974]: I1013 18:18:24.261138 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:25 crc kubenswrapper[4974]: I1013 18:18:25.095069 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8dgp2" Oct 13 18:18:25 crc kubenswrapper[4974]: I1013 18:18:25.212153 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:25 crc kubenswrapper[4974]: I1013 18:18:25.212432 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:25 crc kubenswrapper[4974]: I1013 18:18:25.255454 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:26 crc kubenswrapper[4974]: I1013 18:18:26.080382 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-llxfj" Oct 13 18:18:35 crc kubenswrapper[4974]: I1013 18:18:35.937380 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" podUID="b6539e39-b08d-4c27-a689-6401b299e123" containerName="oauth-openshift" containerID="cri-o://67e77fd01430d95f14c33f48443e369110278cc281f005e9882d6a78ff6a8215" gracePeriod=15 Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.095724 4974 generic.go:334] "Generic (PLEG): container finished" podID="b6539e39-b08d-4c27-a689-6401b299e123" containerID="67e77fd01430d95f14c33f48443e369110278cc281f005e9882d6a78ff6a8215" exitCode=0 Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.095812 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" event={"ID":"b6539e39-b08d-4c27-a689-6401b299e123","Type":"ContainerDied","Data":"67e77fd01430d95f14c33f48443e369110278cc281f005e9882d6a78ff6a8215"} Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.398912 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.431500 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-657494565c-sw4dj"] Oct 13 18:18:36 crc kubenswrapper[4974]: E1013 18:18:36.431738 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6539e39-b08d-4c27-a689-6401b299e123" containerName="oauth-openshift" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.431751 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6539e39-b08d-4c27-a689-6401b299e123" containerName="oauth-openshift" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.431848 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6539e39-b08d-4c27-a689-6401b299e123" containerName="oauth-openshift" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.432238 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.451712 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-657494565c-sw4dj"] Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479482 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-trusted-ca-bundle\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479536 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-session\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479568 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-service-ca\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479596 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-provider-selection\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479665 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-login\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479699 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-audit-policies\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479717 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-serving-cert\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479732 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-ocp-branding-template\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479754 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vhrm\" (UniqueName: \"kubernetes.io/projected/b6539e39-b08d-4c27-a689-6401b299e123-kube-api-access-2vhrm\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479776 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-cliconfig\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479797 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-error\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.479816 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-idp-0-file-data\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.480670 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.480691 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b6539e39-b08d-4c27-a689-6401b299e123-audit-dir\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.480768 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-router-certs\") pod \"b6539e39-b08d-4c27-a689-6401b299e123\" (UID: \"b6539e39-b08d-4c27-a689-6401b299e123\") " Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481045 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481072 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-router-certs\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481097 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6539e39-b08d-4c27-a689-6401b299e123-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481152 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481162 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-template-error\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481195 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481202 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481263 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481346 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-session\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481409 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-template-login\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481465 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-service-ca\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481521 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481561 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/88ff76ee-886e-4097-a180-ba26b79bb59c-audit-dir\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481598 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-audit-policies\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481640 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481724 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs44z\" (UniqueName: \"kubernetes.io/projected/88ff76ee-886e-4097-a180-ba26b79bb59c-kube-api-access-bs44z\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481797 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " 
pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481830 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481928 4974 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b6539e39-b08d-4c27-a689-6401b299e123-audit-dir\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481951 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.481970 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.482056 4974 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-audit-policies\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.482076 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 
crc kubenswrapper[4974]: I1013 18:18:36.502283 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.502990 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6539e39-b08d-4c27-a689-6401b299e123-kube-api-access-2vhrm" (OuterVolumeSpecName: "kube-api-access-2vhrm") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "kube-api-access-2vhrm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.503136 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.503294 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.503492 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.506976 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.507074 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.507492 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.511468 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b6539e39-b08d-4c27-a689-6401b299e123" (UID: "b6539e39-b08d-4c27-a689-6401b299e123"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.583188 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-router-certs\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.583236 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-template-error\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.583256 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.583275 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584092 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-session\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584128 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-template-login\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584153 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-service-ca\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584175 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-657494565c-sw4dj\" 
(UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584195 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88ff76ee-886e-4097-a180-ba26b79bb59c-audit-dir\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584236 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-audit-policies\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584257 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584281 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs44z\" (UniqueName: \"kubernetes.io/projected/88ff76ee-886e-4097-a180-ba26b79bb59c-kube-api-access-bs44z\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584303 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584336 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584382 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584392 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584402 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584411 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vhrm\" (UniqueName: \"kubernetes.io/projected/b6539e39-b08d-4c27-a689-6401b299e123-kube-api-access-2vhrm\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 
18:18:36.584421 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584429 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584439 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584448 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.584457 4974 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b6539e39-b08d-4c27-a689-6401b299e123-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.585053 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-audit-policies\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.585470 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-service-ca\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.585518 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/88ff76ee-886e-4097-a180-ba26b79bb59c-audit-dir\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.585921 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.587693 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.587710 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-template-error\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " 
pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.587864 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.587952 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.587972 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-session\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.588080 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.588623 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-user-template-login\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.589536 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-router-certs\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.590817 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/88ff76ee-886e-4097-a180-ba26b79bb59c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.604369 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs44z\" (UniqueName: \"kubernetes.io/projected/88ff76ee-886e-4097-a180-ba26b79bb59c-kube-api-access-bs44z\") pod \"oauth-openshift-657494565c-sw4dj\" (UID: \"88ff76ee-886e-4097-a180-ba26b79bb59c\") " pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:36 crc kubenswrapper[4974]: I1013 18:18:36.756171 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:37 crc kubenswrapper[4974]: I1013 18:18:37.105547 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" event={"ID":"b6539e39-b08d-4c27-a689-6401b299e123","Type":"ContainerDied","Data":"9ec44c258057023ffe8640f3e778c4d40eadc0a609da5c3f921f605c67c78167"} Oct 13 18:18:37 crc kubenswrapper[4974]: I1013 18:18:37.106248 4974 scope.go:117] "RemoveContainer" containerID="67e77fd01430d95f14c33f48443e369110278cc281f005e9882d6a78ff6a8215" Oct 13 18:18:37 crc kubenswrapper[4974]: I1013 18:18:37.105744 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" Oct 13 18:18:37 crc kubenswrapper[4974]: I1013 18:18:37.132972 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gvsws"] Oct 13 18:18:37 crc kubenswrapper[4974]: I1013 18:18:37.139988 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-gvsws"] Oct 13 18:18:37 crc kubenswrapper[4974]: I1013 18:18:37.282030 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-657494565c-sw4dj"] Oct 13 18:18:37 crc kubenswrapper[4974]: I1013 18:18:37.287052 4974 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-gvsws container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Oct 13 18:18:37 crc kubenswrapper[4974]: I1013 18:18:37.287130 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-gvsws" podUID="b6539e39-b08d-4c27-a689-6401b299e123" 
containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Oct 13 18:18:37 crc kubenswrapper[4974]: I1013 18:18:37.825846 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6539e39-b08d-4c27-a689-6401b299e123" path="/var/lib/kubelet/pods/b6539e39-b08d-4c27-a689-6401b299e123/volumes" Oct 13 18:18:38 crc kubenswrapper[4974]: I1013 18:18:38.116170 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" event={"ID":"88ff76ee-886e-4097-a180-ba26b79bb59c","Type":"ContainerStarted","Data":"89418411bafdc4e2c200360f8fbff365cdf8a73fd598f59af0fa6b62aece7321"} Oct 13 18:18:38 crc kubenswrapper[4974]: I1013 18:18:38.116254 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" event={"ID":"88ff76ee-886e-4097-a180-ba26b79bb59c","Type":"ContainerStarted","Data":"67ff062924c6dc9d3c8a88603a3fda719ae4b36296afa92f277d7d5c97c898a0"} Oct 13 18:18:38 crc kubenswrapper[4974]: I1013 18:18:38.116363 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:38 crc kubenswrapper[4974]: I1013 18:18:38.132609 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" Oct 13 18:18:38 crc kubenswrapper[4974]: I1013 18:18:38.159065 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-657494565c-sw4dj" podStartSLOduration=28.159032926 podStartE2EDuration="28.159032926s" podCreationTimestamp="2025-10-13 18:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 
18:18:38.153160682 +0000 UTC m=+253.057526852" watchObservedRunningTime="2025-10-13 18:18:38.159032926 +0000 UTC m=+253.063399046" Oct 13 18:20:37 crc kubenswrapper[4974]: I1013 18:20:37.743289 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:20:37 crc kubenswrapper[4974]: I1013 18:20:37.744046 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:21:07 crc kubenswrapper[4974]: I1013 18:21:07.742802 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:21:07 crc kubenswrapper[4974]: I1013 18:21:07.743426 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:21:37 crc kubenswrapper[4974]: I1013 18:21:37.743601 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:21:37 crc 
kubenswrapper[4974]: I1013 18:21:37.744394 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:21:37 crc kubenswrapper[4974]: I1013 18:21:37.746605 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:21:37 crc kubenswrapper[4974]: I1013 18:21:37.747363 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29dc800fef44c866f1bfd3f3e803022d871dfc0b9f20d8a971dfc51b409fc8b0"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 18:21:37 crc kubenswrapper[4974]: I1013 18:21:37.747481 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://29dc800fef44c866f1bfd3f3e803022d871dfc0b9f20d8a971dfc51b409fc8b0" gracePeriod=600 Oct 13 18:21:38 crc kubenswrapper[4974]: I1013 18:21:38.337595 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="29dc800fef44c866f1bfd3f3e803022d871dfc0b9f20d8a971dfc51b409fc8b0" exitCode=0 Oct 13 18:21:38 crc kubenswrapper[4974]: I1013 18:21:38.337818 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"29dc800fef44c866f1bfd3f3e803022d871dfc0b9f20d8a971dfc51b409fc8b0"} 
Oct 13 18:21:38 crc kubenswrapper[4974]: I1013 18:21:38.337977 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"8894b6a4641af63269e4147cdef54f2c3c1eada1368d28e6c504cfd79085b430"}
Oct 13 18:21:38 crc kubenswrapper[4974]: I1013 18:21:38.338005 4974 scope.go:117] "RemoveContainer" containerID="5abfd4d67c5136fe10dbe36338a410d1051da7aa7fd3e9f14a539c3923d4b1c3"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.750241 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g2nml"]
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.751644 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.763902 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g2nml"]
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.794442 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbbbbc84-c93a-4179-928c-381388790a23-trusted-ca\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.794839 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.794889 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fbbbbc84-c93a-4179-928c-381388790a23-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.794915 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss45x\" (UniqueName: \"kubernetes.io/projected/fbbbbc84-c93a-4179-928c-381388790a23-kube-api-access-ss45x\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.794956 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fbbbbc84-c93a-4179-928c-381388790a23-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.795005 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fbbbbc84-c93a-4179-928c-381388790a23-registry-certificates\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.795074 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbbbbc84-c93a-4179-928c-381388790a23-bound-sa-token\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.795132 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbbbbc84-c93a-4179-928c-381388790a23-registry-tls\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.821504 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.896574 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fbbbbc84-c93a-4179-928c-381388790a23-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.896838 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fbbbbc84-c93a-4179-928c-381388790a23-registry-certificates\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.896921 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbbbbc84-c93a-4179-928c-381388790a23-bound-sa-token\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.896993 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbbbbc84-c93a-4179-928c-381388790a23-registry-tls\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.897042 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/fbbbbc84-c93a-4179-928c-381388790a23-ca-trust-extracted\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.897110 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbbbbc84-c93a-4179-928c-381388790a23-trusted-ca\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.897189 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fbbbbc84-c93a-4179-928c-381388790a23-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.897259 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss45x\" (UniqueName: \"kubernetes.io/projected/fbbbbc84-c93a-4179-928c-381388790a23-kube-api-access-ss45x\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.898243 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/fbbbbc84-c93a-4179-928c-381388790a23-registry-certificates\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.898687 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fbbbbc84-c93a-4179-928c-381388790a23-trusted-ca\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.903832 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/fbbbbc84-c93a-4179-928c-381388790a23-installation-pull-secrets\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.903849 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/fbbbbc84-c93a-4179-928c-381388790a23-registry-tls\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.914360 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbbbbc84-c93a-4179-928c-381388790a23-bound-sa-token\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:28 crc kubenswrapper[4974]: I1013 18:22:28.922178 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss45x\" (UniqueName: \"kubernetes.io/projected/fbbbbc84-c93a-4179-928c-381388790a23-kube-api-access-ss45x\") pod \"image-registry-66df7c8f76-g2nml\" (UID: \"fbbbbc84-c93a-4179-928c-381388790a23\") " pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:29 crc kubenswrapper[4974]: I1013 18:22:29.073855 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:29 crc kubenswrapper[4974]: I1013 18:22:29.349865 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-g2nml"]
Oct 13 18:22:29 crc kubenswrapper[4974]: W1013 18:22:29.365364 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbbbbc84_c93a_4179_928c_381388790a23.slice/crio-810934ed26a392a2e4008037918c99eaeadf5537b9dcf59e55fe9b6644a1bdb9 WatchSource:0}: Error finding container 810934ed26a392a2e4008037918c99eaeadf5537b9dcf59e55fe9b6644a1bdb9: Status 404 returned error can't find the container with id 810934ed26a392a2e4008037918c99eaeadf5537b9dcf59e55fe9b6644a1bdb9
Oct 13 18:22:29 crc kubenswrapper[4974]: I1013 18:22:29.715765 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-g2nml" event={"ID":"fbbbbc84-c93a-4179-928c-381388790a23","Type":"ContainerStarted","Data":"01a65e30e876af957da7762ea228f5857469b0b548e27f505cc12f8da1122d7b"}
Oct 13 18:22:29 crc kubenswrapper[4974]: I1013 18:22:29.715827 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-g2nml" event={"ID":"fbbbbc84-c93a-4179-928c-381388790a23","Type":"ContainerStarted","Data":"810934ed26a392a2e4008037918c99eaeadf5537b9dcf59e55fe9b6644a1bdb9"}
Oct 13 18:22:29 crc kubenswrapper[4974]: I1013 18:22:29.716865 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:29 crc kubenswrapper[4974]: I1013 18:22:29.758263 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-g2nml" podStartSLOduration=1.7582394959999998 podStartE2EDuration="1.758239496s" podCreationTimestamp="2025-10-13 18:22:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:22:29.753538362 +0000 UTC m=+484.657904492" watchObservedRunningTime="2025-10-13 18:22:29.758239496 +0000 UTC m=+484.662605606"
Oct 13 18:22:49 crc kubenswrapper[4974]: I1013 18:22:49.082016 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-g2nml"
Oct 13 18:22:49 crc kubenswrapper[4974]: I1013 18:22:49.159003 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nlffn"]
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.200761 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" podUID="40c4254f-5c3d-4655-82f9-49fd9510339a" containerName="registry" containerID="cri-o://9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4" gracePeriod=30
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.631246 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn"
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.691685 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-tls\") pod \"40c4254f-5c3d-4655-82f9-49fd9510339a\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") "
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.691776 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-trusted-ca\") pod \"40c4254f-5c3d-4655-82f9-49fd9510339a\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") "
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.691821 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-bound-sa-token\") pod \"40c4254f-5c3d-4655-82f9-49fd9510339a\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") "
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.691862 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40c4254f-5c3d-4655-82f9-49fd9510339a-ca-trust-extracted\") pod \"40c4254f-5c3d-4655-82f9-49fd9510339a\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") "
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.692141 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"40c4254f-5c3d-4655-82f9-49fd9510339a\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") "
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.692186 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-certificates\") pod \"40c4254f-5c3d-4655-82f9-49fd9510339a\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") "
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.692241 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6485l\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-kube-api-access-6485l\") pod \"40c4254f-5c3d-4655-82f9-49fd9510339a\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") "
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.692281 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40c4254f-5c3d-4655-82f9-49fd9510339a-installation-pull-secrets\") pod \"40c4254f-5c3d-4655-82f9-49fd9510339a\" (UID: \"40c4254f-5c3d-4655-82f9-49fd9510339a\") "
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.693204 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "40c4254f-5c3d-4655-82f9-49fd9510339a" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.693274 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "40c4254f-5c3d-4655-82f9-49fd9510339a" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.702568 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-kube-api-access-6485l" (OuterVolumeSpecName: "kube-api-access-6485l") pod "40c4254f-5c3d-4655-82f9-49fd9510339a" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a"). InnerVolumeSpecName "kube-api-access-6485l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.702960 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40c4254f-5c3d-4655-82f9-49fd9510339a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "40c4254f-5c3d-4655-82f9-49fd9510339a" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.705301 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "40c4254f-5c3d-4655-82f9-49fd9510339a" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.707717 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "40c4254f-5c3d-4655-82f9-49fd9510339a" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.714234 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "40c4254f-5c3d-4655-82f9-49fd9510339a" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.717490 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c4254f-5c3d-4655-82f9-49fd9510339a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "40c4254f-5c3d-4655-82f9-49fd9510339a" (UID: "40c4254f-5c3d-4655-82f9-49fd9510339a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.793625 4974 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-certificates\") on node \"crc\" DevicePath \"\""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.793698 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6485l\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-kube-api-access-6485l\") on node \"crc\" DevicePath \"\""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.793711 4974 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/40c4254f-5c3d-4655-82f9-49fd9510339a-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.793723 4974 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-registry-tls\") on node \"crc\" DevicePath \"\""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.793734 4974 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/40c4254f-5c3d-4655-82f9-49fd9510339a-trusted-ca\") on node \"crc\" DevicePath \"\""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.793744 4974 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/40c4254f-5c3d-4655-82f9-49fd9510339a-bound-sa-token\") on node \"crc\" DevicePath \"\""
Oct 13 18:23:14 crc kubenswrapper[4974]: I1013 18:23:14.793755 4974 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/40c4254f-5c3d-4655-82f9-49fd9510339a-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Oct 13 18:23:15 crc kubenswrapper[4974]: I1013 18:23:15.064901 4974 generic.go:334] "Generic (PLEG): container finished" podID="40c4254f-5c3d-4655-82f9-49fd9510339a" containerID="9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4" exitCode=0
Oct 13 18:23:15 crc kubenswrapper[4974]: I1013 18:23:15.064982 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" event={"ID":"40c4254f-5c3d-4655-82f9-49fd9510339a","Type":"ContainerDied","Data":"9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4"}
Oct 13 18:23:15 crc kubenswrapper[4974]: I1013 18:23:15.065007 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn"
Oct 13 18:23:15 crc kubenswrapper[4974]: I1013 18:23:15.065127 4974 scope.go:117] "RemoveContainer" containerID="9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4"
Oct 13 18:23:15 crc kubenswrapper[4974]: I1013 18:23:15.065103 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-nlffn" event={"ID":"40c4254f-5c3d-4655-82f9-49fd9510339a","Type":"ContainerDied","Data":"4692638061a09b175671fb6061fcdaddee4e6c2ca10502b852f83d6d1b33b67c"}
Oct 13 18:23:15 crc kubenswrapper[4974]: I1013 18:23:15.098789 4974 scope.go:117] "RemoveContainer" containerID="9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4"
Oct 13 18:23:15 crc kubenswrapper[4974]: E1013 18:23:15.099280 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4\": container with ID starting with 9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4 not found: ID does not exist" containerID="9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4"
Oct 13 18:23:15 crc kubenswrapper[4974]: I1013 18:23:15.099324 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4"} err="failed to get container status \"9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4\": rpc error: code = NotFound desc = could not find container \"9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4\": container with ID starting with 9f88371314529464b31db0f71509b3e6235b52a8bd44e60e256105fde4cd6fe4 not found: ID does not exist"
Oct 13 18:23:15 crc kubenswrapper[4974]: I1013 18:23:15.115922 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nlffn"]
Oct 13 18:23:15 crc kubenswrapper[4974]: I1013 18:23:15.118363 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-nlffn"]
Oct 13 18:23:15 crc kubenswrapper[4974]: I1013 18:23:15.823260 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40c4254f-5c3d-4655-82f9-49fd9510339a" path="/var/lib/kubelet/pods/40c4254f-5c3d-4655-82f9-49fd9510339a/volumes"
Oct 13 18:23:26 crc kubenswrapper[4974]: I1013 18:23:26.907615 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-pbvz7"]
Oct 13 18:23:26 crc kubenswrapper[4974]: E1013 18:23:26.908354 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c4254f-5c3d-4655-82f9-49fd9510339a" containerName="registry"
Oct 13 18:23:26 crc kubenswrapper[4974]: I1013 18:23:26.908370 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c4254f-5c3d-4655-82f9-49fd9510339a" containerName="registry"
Oct 13 18:23:26 crc kubenswrapper[4974]: I1013 18:23:26.908504 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c4254f-5c3d-4655-82f9-49fd9510339a" containerName="registry"
Oct 13 18:23:26 crc kubenswrapper[4974]: I1013 18:23:26.908885 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-pbvz7"
Oct 13 18:23:26 crc kubenswrapper[4974]: I1013 18:23:26.912797 4974 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-5hqcz"
Oct 13 18:23:26 crc kubenswrapper[4974]: I1013 18:23:26.913418 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Oct 13 18:23:26 crc kubenswrapper[4974]: I1013 18:23:26.913465 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Oct 13 18:23:26 crc kubenswrapper[4974]: I1013 18:23:26.932164 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-79kfk"]
Oct 13 18:23:26 crc kubenswrapper[4974]: I1013 18:23:26.932934 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-79kfk"
Oct 13 18:23:26 crc kubenswrapper[4974]: I1013 18:23:26.948753 4974 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-fq88g"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.011771 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cwpp\" (UniqueName: \"kubernetes.io/projected/70f7d9f3-cbb4-4009-9feb-89a4eb2bbf95-kube-api-access-8cwpp\") pod \"cert-manager-cainjector-7f985d654d-pbvz7\" (UID: \"70f7d9f3-cbb4-4009-9feb-89a4eb2bbf95\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-pbvz7"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.011832 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdzgc\" (UniqueName: \"kubernetes.io/projected/37764aab-fdaf-4d54-8afc-f2788411ff07-kube-api-access-mdzgc\") pod \"cert-manager-5b446d88c5-79kfk\" (UID: \"37764aab-fdaf-4d54-8afc-f2788411ff07\") " pod="cert-manager/cert-manager-5b446d88c5-79kfk"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.011838 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-pbvz7"]
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.014681 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lhmgd"]
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.015272 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-lhmgd"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.017848 4974 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-xzn7k"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.023316 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-79kfk"]
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.025308 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lhmgd"]
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.113372 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcthk\" (UniqueName: \"kubernetes.io/projected/88a02d7c-89e6-464c-b519-aeb3fe4dfda3-kube-api-access-jcthk\") pod \"cert-manager-webhook-5655c58dd6-lhmgd\" (UID: \"88a02d7c-89e6-464c-b519-aeb3fe4dfda3\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lhmgd"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.113686 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cwpp\" (UniqueName: \"kubernetes.io/projected/70f7d9f3-cbb4-4009-9feb-89a4eb2bbf95-kube-api-access-8cwpp\") pod \"cert-manager-cainjector-7f985d654d-pbvz7\" (UID: \"70f7d9f3-cbb4-4009-9feb-89a4eb2bbf95\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-pbvz7"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.113905 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdzgc\" (UniqueName: \"kubernetes.io/projected/37764aab-fdaf-4d54-8afc-f2788411ff07-kube-api-access-mdzgc\") pod \"cert-manager-5b446d88c5-79kfk\" (UID: \"37764aab-fdaf-4d54-8afc-f2788411ff07\") " pod="cert-manager/cert-manager-5b446d88c5-79kfk"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.132190 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cwpp\" (UniqueName: \"kubernetes.io/projected/70f7d9f3-cbb4-4009-9feb-89a4eb2bbf95-kube-api-access-8cwpp\") pod \"cert-manager-cainjector-7f985d654d-pbvz7\" (UID: \"70f7d9f3-cbb4-4009-9feb-89a4eb2bbf95\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-pbvz7"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.135177 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdzgc\" (UniqueName: \"kubernetes.io/projected/37764aab-fdaf-4d54-8afc-f2788411ff07-kube-api-access-mdzgc\") pod \"cert-manager-5b446d88c5-79kfk\" (UID: \"37764aab-fdaf-4d54-8afc-f2788411ff07\") " pod="cert-manager/cert-manager-5b446d88c5-79kfk"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.214916 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcthk\" (UniqueName: \"kubernetes.io/projected/88a02d7c-89e6-464c-b519-aeb3fe4dfda3-kube-api-access-jcthk\") pod \"cert-manager-webhook-5655c58dd6-lhmgd\" (UID: \"88a02d7c-89e6-464c-b519-aeb3fe4dfda3\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lhmgd"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.224960 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-pbvz7"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.237239 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcthk\" (UniqueName: \"kubernetes.io/projected/88a02d7c-89e6-464c-b519-aeb3fe4dfda3-kube-api-access-jcthk\") pod \"cert-manager-webhook-5655c58dd6-lhmgd\" (UID: \"88a02d7c-89e6-464c-b519-aeb3fe4dfda3\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-lhmgd"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.249464 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-79kfk"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.326436 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-lhmgd"
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.520122 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-79kfk"]
Oct 13 18:23:27 crc kubenswrapper[4974]: W1013 18:23:27.529783 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37764aab_fdaf_4d54_8afc_f2788411ff07.slice/crio-3e7420515630048568928160dc83444f5c6bd9bafe9f6874e4c2429e1bb21d0c WatchSource:0}: Error finding container 3e7420515630048568928160dc83444f5c6bd9bafe9f6874e4c2429e1bb21d0c: Status 404 returned error can't find the container with id 3e7420515630048568928160dc83444f5c6bd9bafe9f6874e4c2429e1bb21d0c
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.534305 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.567072 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-pbvz7"]
Oct 13 18:23:27 crc kubenswrapper[4974]: W1013 18:23:27.575838 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70f7d9f3_cbb4_4009_9feb_89a4eb2bbf95.slice/crio-87afb2cae6643077ec2ed216ae408ce6f1ae640d75f3e1efe791f09536d0aac9 WatchSource:0}: Error finding container 87afb2cae6643077ec2ed216ae408ce6f1ae640d75f3e1efe791f09536d0aac9: Status 404 returned error can't find the container with id 87afb2cae6643077ec2ed216ae408ce6f1ae640d75f3e1efe791f09536d0aac9
Oct 13 18:23:27 crc kubenswrapper[4974]: I1013 18:23:27.604567 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-lhmgd"]
Oct 13 18:23:28 crc kubenswrapper[4974]: I1013 18:23:28.160460 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-pbvz7" event={"ID":"70f7d9f3-cbb4-4009-9feb-89a4eb2bbf95","Type":"ContainerStarted","Data":"87afb2cae6643077ec2ed216ae408ce6f1ae640d75f3e1efe791f09536d0aac9"}
Oct 13 18:23:28 crc kubenswrapper[4974]: I1013 18:23:28.172260 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-79kfk" event={"ID":"37764aab-fdaf-4d54-8afc-f2788411ff07","Type":"ContainerStarted","Data":"3e7420515630048568928160dc83444f5c6bd9bafe9f6874e4c2429e1bb21d0c"}
Oct 13 18:23:28 crc kubenswrapper[4974]: I1013 18:23:28.175592 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-lhmgd" event={"ID":"88a02d7c-89e6-464c-b519-aeb3fe4dfda3","Type":"ContainerStarted","Data":"90164753724da5bd1ce53d7bd3687a4156fe2427d946da83f620416a5000e0b7"}
Oct 13 18:23:32 crc kubenswrapper[4974]: I1013 18:23:32.221056 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-pbvz7" event={"ID":"70f7d9f3-cbb4-4009-9feb-89a4eb2bbf95","Type":"ContainerStarted","Data":"e67597d3ad98d7b56b2bbffc550af527cbc2263a6fa8c441b1979430c7724d7f"}
Oct 13 18:23:32 crc kubenswrapper[4974]: I1013 18:23:32.223528 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-79kfk" event={"ID":"37764aab-fdaf-4d54-8afc-f2788411ff07","Type":"ContainerStarted","Data":"8e99ac7a782dbc7d975be1c039e0fd8e7c3bcc90e096ea541ce3a52aee189c0a"}
Oct 13 18:23:32 crc kubenswrapper[4974]: I1013 18:23:32.225786 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-lhmgd" event={"ID":"88a02d7c-89e6-464c-b519-aeb3fe4dfda3","Type":"ContainerStarted","Data":"202a3fcb39597cc70e87449abc3588f512a1c12f4f286025883b0543d3608176"}
Oct 13 18:23:32 crc kubenswrapper[4974]: I1013 18:23:32.226021 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-lhmgd"
Oct 13 18:23:32 crc kubenswrapper[4974]: I1013 18:23:32.251110 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-pbvz7" podStartSLOduration=2.895109658 podStartE2EDuration="6.25108233s" podCreationTimestamp="2025-10-13 18:23:26 +0000 UTC" firstStartedPulling="2025-10-13 18:23:27.578593472 +0000 UTC m=+542.482959562" lastFinishedPulling="2025-10-13 18:23:30.934566144 +0000 UTC m=+545.838932234" observedRunningTime="2025-10-13 18:23:32.246938939 +0000 UTC m=+547.151305069" watchObservedRunningTime="2025-10-13 18:23:32.25108233 +0000 UTC m=+547.155448450"
Oct 13 18:23:32 crc kubenswrapper[4974]: I1013 18:23:32.267969 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-lhmgd" podStartSLOduration=2.874749981 podStartE2EDuration="6.267932563s" podCreationTimestamp="2025-10-13 18:23:26 +0000 UTC" firstStartedPulling="2025-10-13 18:23:27.618557631 +0000 UTC m=+542.522923711" lastFinishedPulling="2025-10-13 18:23:31.011740183 +0000 UTC m=+545.916106293" observedRunningTime="2025-10-13 18:23:32.265901324 +0000 UTC m=+547.170267444"
watchObservedRunningTime="2025-10-13 18:23:32.267932563 +0000 UTC m=+547.172298693" Oct 13 18:23:32 crc kubenswrapper[4974]: I1013 18:23:32.289752 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-79kfk" podStartSLOduration=2.817520566 podStartE2EDuration="6.289727491s" podCreationTimestamp="2025-10-13 18:23:26 +0000 UTC" firstStartedPulling="2025-10-13 18:23:27.534070498 +0000 UTC m=+542.438436578" lastFinishedPulling="2025-10-13 18:23:31.006277413 +0000 UTC m=+545.910643503" observedRunningTime="2025-10-13 18:23:32.288357121 +0000 UTC m=+547.192723251" watchObservedRunningTime="2025-10-13 18:23:32.289727491 +0000 UTC m=+547.194093601" Oct 13 18:23:37 crc kubenswrapper[4974]: I1013 18:23:37.330000 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-lhmgd" Oct 13 18:23:37 crc kubenswrapper[4974]: I1013 18:23:37.487526 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwcs8"] Oct 13 18:23:37 crc kubenswrapper[4974]: I1013 18:23:37.487969 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovn-controller" containerID="cri-o://ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e" gracePeriod=30 Oct 13 18:23:37 crc kubenswrapper[4974]: I1013 18:23:37.488067 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="nbdb" containerID="cri-o://1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" gracePeriod=30 Oct 13 18:23:37 crc kubenswrapper[4974]: I1013 18:23:37.488180 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" 
podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovn-acl-logging" containerID="cri-o://2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f" gracePeriod=30 Oct 13 18:23:37 crc kubenswrapper[4974]: I1013 18:23:37.488208 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="sbdb" containerID="cri-o://f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd" gracePeriod=30 Oct 13 18:23:37 crc kubenswrapper[4974]: I1013 18:23:37.488099 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="kube-rbac-proxy-node" containerID="cri-o://f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b" gracePeriod=30 Oct 13 18:23:37 crc kubenswrapper[4974]: I1013 18:23:37.488410 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="northd" containerID="cri-o://f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73" gracePeriod=30 Oct 13 18:23:37 crc kubenswrapper[4974]: I1013 18:23:37.488486 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f" gracePeriod=30 Oct 13 18:23:37 crc kubenswrapper[4974]: I1013 18:23:37.524209 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" 
containerID="cri-o://69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8" gracePeriod=30 Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.252926 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941 is running failed: container process not found" containerID="1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.254197 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941 is running failed: container process not found" containerID="1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.255365 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941 is running failed: container process not found" containerID="1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.255442 4974 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941 is running failed: container process not found" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="nbdb" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.257841 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/3.log" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.261794 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovn-acl-logging/0.log" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.263734 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovn-controller/0.log" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.264736 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.277917 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovnkube-controller/3.log" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.282402 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovn-acl-logging/0.log" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.283459 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-zwcs8_d9f54cc7-5b3b-4481-9be5-f03df1854435/ovn-controller/0.log" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284215 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8" exitCode=0 Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284261 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd" exitCode=0 Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284277 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" exitCode=0 Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284306 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73" exitCode=0 Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284322 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" 
containerID="a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f" exitCode=0 Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284334 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b" exitCode=0 Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284350 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f" exitCode=143 Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284366 4974 generic.go:334] "Generic (PLEG): container finished" podID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerID="ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e" exitCode=143 Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284386 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284261 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284461 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284494 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" 
event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284523 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284550 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284562 4974 scope.go:117] "RemoveContainer" containerID="69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284578 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284708 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284731 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284745 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284760 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284774 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284789 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284803 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284820 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284832 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284849 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284873 4974 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284888 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284902 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284918 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284934 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284948 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284963 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284978 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.284993 4974 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285008 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285028 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285051 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285064 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285075 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285087 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285098 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73"} Oct 13 
18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285109 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285121 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285132 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285143 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285154 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285169 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zwcs8" event={"ID":"d9f54cc7-5b3b-4481-9be5-f03df1854435","Type":"ContainerDied","Data":"8dec09712400c9a59fc9f1746ad1c613e0ad3bf489988bf2ff3b84803b6a0e4d"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285187 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285199 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285210 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285222 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285233 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285243 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285254 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285265 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285275 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.285286 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.287814 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/2.log" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.289104 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/1.log" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.289178 4974 generic.go:334] "Generic (PLEG): container finished" podID="9c38c0e3-9bee-402b-adf0-27ac9e31c0f0" containerID="a51eb90e50915e1d2d4940bd5a7e011787bd8e414e7a1abfc7d70efc8c48343d" exitCode=2 Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.289220 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xcspx" event={"ID":"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0","Type":"ContainerDied","Data":"a51eb90e50915e1d2d4940bd5a7e011787bd8e414e7a1abfc7d70efc8c48343d"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.289255 4974 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94"} Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.290142 4974 scope.go:117] "RemoveContainer" containerID="a51eb90e50915e1d2d4940bd5a7e011787bd8e414e7a1abfc7d70efc8c48343d" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.290600 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xcspx_openshift-multus(9c38c0e3-9bee-402b-adf0-27ac9e31c0f0)\"" pod="openshift-multus/multus-xcspx" podUID="9c38c0e3-9bee-402b-adf0-27ac9e31c0f0" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.335769 4974 
scope.go:117] "RemoveContainer" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.364128 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-9pvm6"] Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.364608 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="sbdb" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.364649 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="sbdb" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.364705 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovn-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.364725 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovn-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.364752 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.364772 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.364798 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.364818 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.364840 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovn-acl-logging" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.364859 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovn-acl-logging" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.364887 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="northd" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.364905 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="northd" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.364931 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="kube-rbac-proxy-ovn-metrics" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.364948 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="kube-rbac-proxy-ovn-metrics" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.364967 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="kubecfg-setup" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.364984 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="kubecfg-setup" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.365009 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365026 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.365045 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="kube-rbac-proxy-node" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365063 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="kube-rbac-proxy-node" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.365094 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="nbdb" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365113 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="nbdb" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365409 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovn-acl-logging" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365439 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365459 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365485 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365508 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365526 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovn-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365551 4974 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="kube-rbac-proxy-ovn-metrics" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365582 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="kube-rbac-proxy-node" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365610 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="nbdb" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365629 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="northd" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.365688 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="sbdb" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.365986 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.366012 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.366279 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.366547 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.366572 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" containerName="ovnkube-controller" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371325 4974 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371401 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-env-overrides\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371452 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-netns\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371472 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-config\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371491 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-slash\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371509 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-ovn-kubernetes\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371537 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-s8nch\" (UniqueName: \"kubernetes.io/projected/d9f54cc7-5b3b-4481-9be5-f03df1854435-kube-api-access-s8nch\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371565 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-netd\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371617 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovn-node-metrics-cert\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371690 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371729 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-script-lib\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371761 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-log-socket\") pod 
\"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371797 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-openvswitch\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371822 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-node-log\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371851 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-ovn\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371881 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-systemd-units\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371900 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-systemd\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371926 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-bin\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371945 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-kubelet\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371963 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-etc-openvswitch\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.371997 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-var-lib-openvswitch\") pod \"d9f54cc7-5b3b-4481-9be5-f03df1854435\" (UID: \"d9f54cc7-5b3b-4481-9be5-f03df1854435\") " Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.372174 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-node-log" (OuterVolumeSpecName: "node-log") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.372234 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.372246 4974 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-node-log\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.372284 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-slash" (OuterVolumeSpecName: "host-slash") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.372596 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.372643 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.372711 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.372719 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-log-socket" (OuterVolumeSpecName: "log-socket") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.372866 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.372978 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.373029 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.373046 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.373068 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.373082 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.373111 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.373429 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.373452 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.373985 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.386271 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9f54cc7-5b3b-4481-9be5-f03df1854435-kube-api-access-s8nch" (OuterVolumeSpecName: "kube-api-access-s8nch") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "kube-api-access-s8nch". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.387262 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.388293 4974 scope.go:117] "RemoveContainer" containerID="f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.401126 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d9f54cc7-5b3b-4481-9be5-f03df1854435" (UID: "d9f54cc7-5b3b-4481-9be5-f03df1854435"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.414784 4974 scope.go:117] "RemoveContainer" containerID="1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.433554 4974 scope.go:117] "RemoveContainer" containerID="f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.447522 4974 scope.go:117] "RemoveContainer" containerID="a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.464549 4974 scope.go:117] "RemoveContainer" containerID="f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473628 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-cni-netd\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473676 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-cni-bin\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473696 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a00c1900-bb71-4fbc-bb83-17281f2793ca-env-overrides\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc 
kubenswrapper[4974]: I1013 18:23:38.473713 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a00c1900-bb71-4fbc-bb83-17281f2793ca-ovn-node-metrics-cert\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473737 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-run-ovn\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473755 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-node-log\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473769 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-var-lib-openvswitch\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473785 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-kubelet\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 
18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473798 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-slash\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473813 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a00c1900-bb71-4fbc-bb83-17281f2793ca-ovnkube-config\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473830 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-systemd-units\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473849 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wgnx\" (UniqueName: \"kubernetes.io/projected/a00c1900-bb71-4fbc-bb83-17281f2793ca-kube-api-access-5wgnx\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473865 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-etc-openvswitch\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473886 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a00c1900-bb71-4fbc-bb83-17281f2793ca-ovnkube-script-lib\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473908 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473928 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-log-socket\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473947 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-run-openvswitch\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473961 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-run-netns\") pod 
\"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.473992 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-run-systemd\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474007 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-run-ovn-kubernetes\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474045 4974 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-systemd-units\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474055 4974 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-systemd\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474065 4974 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-bin\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474076 4974 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-kubelet\") on node \"crc\" DevicePath \"\"" 
Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474084 4974 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474093 4974 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474101 4974 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-env-overrides\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474109 4974 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-netns\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474117 4974 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474125 4974 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-cni-netd\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474133 4974 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-slash\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474141 4974 reconciler_common.go:293] "Volume detached for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474152 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8nch\" (UniqueName: \"kubernetes.io/projected/d9f54cc7-5b3b-4481-9be5-f03df1854435-kube-api-access-s8nch\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474161 4974 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474172 4974 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474182 4974 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d9f54cc7-5b3b-4481-9be5-f03df1854435-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474190 4974 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-log-socket\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474198 4974 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-openvswitch\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.474206 4974 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d9f54cc7-5b3b-4481-9be5-f03df1854435-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.480049 4974 scope.go:117] "RemoveContainer" containerID="2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.494864 4974 scope.go:117] "RemoveContainer" containerID="ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.513009 4974 scope.go:117] "RemoveContainer" containerID="1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.528081 4974 scope.go:117] "RemoveContainer" containerID="69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.528419 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": container with ID starting with 69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8 not found: ID does not exist" containerID="69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.528710 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8"} err="failed to get container status \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": rpc error: code = NotFound desc = could not find container \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": container with ID starting with 69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.528932 4974 scope.go:117] "RemoveContainer" 
containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.529397 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\": container with ID starting with 86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f not found: ID does not exist" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.529417 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f"} err="failed to get container status \"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\": rpc error: code = NotFound desc = could not find container \"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\": container with ID starting with 86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.529430 4974 scope.go:117] "RemoveContainer" containerID="f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.529751 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\": container with ID starting with f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd not found: ID does not exist" containerID="f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.529828 4974 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd"} err="failed to get container status \"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\": rpc error: code = NotFound desc = could not find container \"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\": container with ID starting with f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.529881 4974 scope.go:117] "RemoveContainer" containerID="1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.530270 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\": container with ID starting with 1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941 not found: ID does not exist" containerID="1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.530297 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941"} err="failed to get container status \"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\": rpc error: code = NotFound desc = could not find container \"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\": container with ID starting with 1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.530313 4974 scope.go:117] "RemoveContainer" containerID="f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.530597 4974 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\": container with ID starting with f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73 not found: ID does not exist" containerID="f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.530792 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73"} err="failed to get container status \"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\": rpc error: code = NotFound desc = could not find container \"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\": container with ID starting with f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.530929 4974 scope.go:117] "RemoveContainer" containerID="a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.531320 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\": container with ID starting with a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f not found: ID does not exist" containerID="a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.531337 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f"} err="failed to get container status \"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\": rpc error: code = NotFound desc = could not find container 
\"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\": container with ID starting with a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.531349 4974 scope.go:117] "RemoveContainer" containerID="f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.531679 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\": container with ID starting with f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b not found: ID does not exist" containerID="f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.531699 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b"} err="failed to get container status \"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\": rpc error: code = NotFound desc = could not find container \"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\": container with ID starting with f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.531712 4974 scope.go:117] "RemoveContainer" containerID="2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.532028 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\": container with ID starting with 2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f not found: ID does not exist" 
containerID="2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.532167 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f"} err="failed to get container status \"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\": rpc error: code = NotFound desc = could not find container \"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\": container with ID starting with 2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.532296 4974 scope.go:117] "RemoveContainer" containerID="ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.532741 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\": container with ID starting with ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e not found: ID does not exist" containerID="ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.532759 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e"} err="failed to get container status \"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\": rpc error: code = NotFound desc = could not find container \"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\": container with ID starting with ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.532771 4974 scope.go:117] 
"RemoveContainer" containerID="1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7" Oct 13 18:23:38 crc kubenswrapper[4974]: E1013 18:23:38.533203 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\": container with ID starting with 1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7 not found: ID does not exist" containerID="1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.533239 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7"} err="failed to get container status \"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\": rpc error: code = NotFound desc = could not find container \"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\": container with ID starting with 1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.533261 4974 scope.go:117] "RemoveContainer" containerID="69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.533585 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8"} err="failed to get container status \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": rpc error: code = NotFound desc = could not find container \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": container with ID starting with 69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.533760 4974 
scope.go:117] "RemoveContainer" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.534126 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f"} err="failed to get container status \"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\": rpc error: code = NotFound desc = could not find container \"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\": container with ID starting with 86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.534158 4974 scope.go:117] "RemoveContainer" containerID="f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.534382 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd"} err="failed to get container status \"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\": rpc error: code = NotFound desc = could not find container \"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\": container with ID starting with f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.534406 4974 scope.go:117] "RemoveContainer" containerID="1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.534713 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941"} err="failed to get container status \"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\": rpc 
error: code = NotFound desc = could not find container \"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\": container with ID starting with 1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.534728 4974 scope.go:117] "RemoveContainer" containerID="f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.534946 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73"} err="failed to get container status \"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\": rpc error: code = NotFound desc = could not find container \"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\": container with ID starting with f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.534963 4974 scope.go:117] "RemoveContainer" containerID="a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.535290 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f"} err="failed to get container status \"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\": rpc error: code = NotFound desc = could not find container \"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\": container with ID starting with a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.535344 4974 scope.go:117] "RemoveContainer" containerID="f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b" Oct 13 18:23:38 crc 
kubenswrapper[4974]: I1013 18:23:38.535681 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b"} err="failed to get container status \"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\": rpc error: code = NotFound desc = could not find container \"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\": container with ID starting with f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.535710 4974 scope.go:117] "RemoveContainer" containerID="2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.536121 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f"} err="failed to get container status \"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\": rpc error: code = NotFound desc = could not find container \"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\": container with ID starting with 2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.536172 4974 scope.go:117] "RemoveContainer" containerID="ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.536623 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e"} err="failed to get container status \"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\": rpc error: code = NotFound desc = could not find container \"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\": container 
with ID starting with ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.536641 4974 scope.go:117] "RemoveContainer" containerID="1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.536975 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7"} err="failed to get container status \"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\": rpc error: code = NotFound desc = could not find container \"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\": container with ID starting with 1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.536990 4974 scope.go:117] "RemoveContainer" containerID="69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.537412 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8"} err="failed to get container status \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": rpc error: code = NotFound desc = could not find container \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": container with ID starting with 69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.537569 4974 scope.go:117] "RemoveContainer" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.538048 4974 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f"} err="failed to get container status \"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\": rpc error: code = NotFound desc = could not find container \"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\": container with ID starting with 86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.538064 4974 scope.go:117] "RemoveContainer" containerID="f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.538466 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd"} err="failed to get container status \"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\": rpc error: code = NotFound desc = could not find container \"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\": container with ID starting with f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.538671 4974 scope.go:117] "RemoveContainer" containerID="1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.539499 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941"} err="failed to get container status \"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\": rpc error: code = NotFound desc = could not find container \"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\": container with ID starting with 1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941 not found: ID does not 
exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.539515 4974 scope.go:117] "RemoveContainer" containerID="f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.539824 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73"} err="failed to get container status \"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\": rpc error: code = NotFound desc = could not find container \"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\": container with ID starting with f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.539849 4974 scope.go:117] "RemoveContainer" containerID="a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.540147 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f"} err="failed to get container status \"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\": rpc error: code = NotFound desc = could not find container \"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\": container with ID starting with a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.540167 4974 scope.go:117] "RemoveContainer" containerID="f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.540435 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b"} err="failed to get container status 
\"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\": rpc error: code = NotFound desc = could not find container \"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\": container with ID starting with f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.540451 4974 scope.go:117] "RemoveContainer" containerID="2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.540910 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f"} err="failed to get container status \"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\": rpc error: code = NotFound desc = could not find container \"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\": container with ID starting with 2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.540937 4974 scope.go:117] "RemoveContainer" containerID="ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.541193 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e"} err="failed to get container status \"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\": rpc error: code = NotFound desc = could not find container \"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\": container with ID starting with ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.541220 4974 scope.go:117] "RemoveContainer" 
containerID="1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.541487 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7"} err="failed to get container status \"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\": rpc error: code = NotFound desc = could not find container \"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\": container with ID starting with 1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.541514 4974 scope.go:117] "RemoveContainer" containerID="69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.541881 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8"} err="failed to get container status \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": rpc error: code = NotFound desc = could not find container \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": container with ID starting with 69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.541897 4974 scope.go:117] "RemoveContainer" containerID="86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.542181 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f"} err="failed to get container status \"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\": rpc error: code = NotFound desc = could 
not find container \"86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f\": container with ID starting with 86b768271dd1f9bf0b9510965309f88efeea8694aa499e48a3111e155269472f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.542338 4974 scope.go:117] "RemoveContainer" containerID="f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.542753 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd"} err="failed to get container status \"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\": rpc error: code = NotFound desc = could not find container \"f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd\": container with ID starting with f3e0a8d6013323ef00db9805b72ff98de82e2b09c0d1ffb61593f01d520f12fd not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.542769 4974 scope.go:117] "RemoveContainer" containerID="1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.543364 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941"} err="failed to get container status \"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\": rpc error: code = NotFound desc = could not find container \"1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941\": container with ID starting with 1226f1cbbeb9a1504b8ddd972a12e5dc987e25475b256dacbb27fc348ac23941 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.543395 4974 scope.go:117] "RemoveContainer" containerID="f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 
18:23:38.543668 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73"} err="failed to get container status \"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\": rpc error: code = NotFound desc = could not find container \"f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73\": container with ID starting with f361ef3aded68a397c287f603522891411deed5aca9192d6ddb2c965c6180c73 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.543686 4974 scope.go:117] "RemoveContainer" containerID="a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.544214 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f"} err="failed to get container status \"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\": rpc error: code = NotFound desc = could not find container \"a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f\": container with ID starting with a8d384694c24bd6f0a50e5006509958f68cac04fc4a7dd70642677bc2a28d83f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.544250 4974 scope.go:117] "RemoveContainer" containerID="f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.544484 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b"} err="failed to get container status \"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\": rpc error: code = NotFound desc = could not find container \"f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b\": container with ID starting with 
f8a9ca613fc203db4279a6d1dd5c5d2fcd6e64abfa0b66074530601915ee186b not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.544504 4974 scope.go:117] "RemoveContainer" containerID="2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.544922 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f"} err="failed to get container status \"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\": rpc error: code = NotFound desc = could not find container \"2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f\": container with ID starting with 2c26b0ea8732719fd6bfc333cab0b2ff1635ddc13444a090ce6b2ae86d892e0f not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.544956 4974 scope.go:117] "RemoveContainer" containerID="ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.545258 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e"} err="failed to get container status \"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\": rpc error: code = NotFound desc = could not find container \"ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e\": container with ID starting with ea7faebda0cbec17ee20dd5ed6398a80651c01390fd41f8fc071f7932b927f6e not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.545275 4974 scope.go:117] "RemoveContainer" containerID="1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.545588 4974 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7"} err="failed to get container status \"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\": rpc error: code = NotFound desc = could not find container \"1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7\": container with ID starting with 1e2f9efbb27ffab44f1a4dad5c96d90cced801b4b31b4a941218d303ac18fce7 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.545793 4974 scope.go:117] "RemoveContainer" containerID="69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.546422 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8"} err="failed to get container status \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": rpc error: code = NotFound desc = could not find container \"69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8\": container with ID starting with 69da4d650e96064e74215e97715b16c7f87a0b4fbd87c9912664fc582c49fcf8 not found: ID does not exist" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.575618 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-cni-bin\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.575732 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-cni-netd\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" 
Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.575776 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a00c1900-bb71-4fbc-bb83-17281f2793ca-env-overrides\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.575810 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a00c1900-bb71-4fbc-bb83-17281f2793ca-ovn-node-metrics-cert\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.575852 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-run-ovn\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.575881 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-node-log\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.575910 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-var-lib-openvswitch\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.575947 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-kubelet\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.575977 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-slash\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576005 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a00c1900-bb71-4fbc-bb83-17281f2793ca-ovnkube-config\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576034 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-systemd-units\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576065 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wgnx\" (UniqueName: \"kubernetes.io/projected/a00c1900-bb71-4fbc-bb83-17281f2793ca-kube-api-access-5wgnx\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576100 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-etc-openvswitch\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576144 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a00c1900-bb71-4fbc-bb83-17281f2793ca-ovnkube-script-lib\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576189 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576225 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-log-socket\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576262 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-run-openvswitch\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576295 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-run-netns\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576363 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-run-systemd\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576395 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-run-ovn-kubernetes\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576547 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-run-ovn-kubernetes\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.576615 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-cni-netd\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.577240 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-cni-bin\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.577439 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-systemd-units\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.577553 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a00c1900-bb71-4fbc-bb83-17281f2793ca-env-overrides\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.578063 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-etc-openvswitch\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.578672 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-run-openvswitch\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.578680 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-run-ovn\") pod \"ovnkube-node-9pvm6\" (UID: 
\"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.578741 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-kubelet\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.578751 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-run-netns\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.578772 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-node-log\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.578796 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-slash\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.578802 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-var-lib-openvswitch\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc 
kubenswrapper[4974]: I1013 18:23:38.578832 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-run-systemd\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.578824 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.578918 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a00c1900-bb71-4fbc-bb83-17281f2793ca-log-socket\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.579510 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a00c1900-bb71-4fbc-bb83-17281f2793ca-ovnkube-script-lib\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.579970 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a00c1900-bb71-4fbc-bb83-17281f2793ca-ovnkube-config\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.585304 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a00c1900-bb71-4fbc-bb83-17281f2793ca-ovn-node-metrics-cert\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.610085 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wgnx\" (UniqueName: \"kubernetes.io/projected/a00c1900-bb71-4fbc-bb83-17281f2793ca-kube-api-access-5wgnx\") pod \"ovnkube-node-9pvm6\" (UID: \"a00c1900-bb71-4fbc-bb83-17281f2793ca\") " pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.689883 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwcs8"] Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.694481 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:38 crc kubenswrapper[4974]: I1013 18:23:38.697906 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zwcs8"] Oct 13 18:23:39 crc kubenswrapper[4974]: I1013 18:23:39.309361 4974 generic.go:334] "Generic (PLEG): container finished" podID="a00c1900-bb71-4fbc-bb83-17281f2793ca" containerID="385c9a123b3001b02f3fad6c4f7787a07894deb3b82d8c24455111655c85c0e2" exitCode=0 Oct 13 18:23:39 crc kubenswrapper[4974]: I1013 18:23:39.309511 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" event={"ID":"a00c1900-bb71-4fbc-bb83-17281f2793ca","Type":"ContainerDied","Data":"385c9a123b3001b02f3fad6c4f7787a07894deb3b82d8c24455111655c85c0e2"} Oct 13 18:23:39 crc kubenswrapper[4974]: I1013 18:23:39.309709 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" 
event={"ID":"a00c1900-bb71-4fbc-bb83-17281f2793ca","Type":"ContainerStarted","Data":"ec887390a10ad99e5c69f6efe83798cc6b800983598fb9de76a097492ade21e0"} Oct 13 18:23:39 crc kubenswrapper[4974]: I1013 18:23:39.823861 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9f54cc7-5b3b-4481-9be5-f03df1854435" path="/var/lib/kubelet/pods/d9f54cc7-5b3b-4481-9be5-f03df1854435/volumes" Oct 13 18:23:40 crc kubenswrapper[4974]: I1013 18:23:40.320479 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" event={"ID":"a00c1900-bb71-4fbc-bb83-17281f2793ca","Type":"ContainerStarted","Data":"80f7e904870cce17b7f69bae8bcb5cc885fe59291fcfd236457f1a5db9f73b4e"} Oct 13 18:23:40 crc kubenswrapper[4974]: I1013 18:23:40.321019 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" event={"ID":"a00c1900-bb71-4fbc-bb83-17281f2793ca","Type":"ContainerStarted","Data":"43e2d7400c5ff8efdfa64eb7b21a6cbcfa2f53a4815d961ff9ccac0baf5a549a"} Oct 13 18:23:40 crc kubenswrapper[4974]: I1013 18:23:40.321048 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" event={"ID":"a00c1900-bb71-4fbc-bb83-17281f2793ca","Type":"ContainerStarted","Data":"441db59d8bea5f5b9587e12e97e65b8cd28b203e9b82e686f849f59df8ce5e9a"} Oct 13 18:23:40 crc kubenswrapper[4974]: I1013 18:23:40.321068 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" event={"ID":"a00c1900-bb71-4fbc-bb83-17281f2793ca","Type":"ContainerStarted","Data":"028afea10c2e4f4c3d6a7e3d692cd41b4df073b760e46c9d5039677157465724"} Oct 13 18:23:40 crc kubenswrapper[4974]: I1013 18:23:40.321088 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" 
event={"ID":"a00c1900-bb71-4fbc-bb83-17281f2793ca","Type":"ContainerStarted","Data":"2c8b98254d33aa996e8456ffd2ad77cad9cc7bbf995a73b37f1ef1b1eb0615ae"} Oct 13 18:23:40 crc kubenswrapper[4974]: I1013 18:23:40.321107 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" event={"ID":"a00c1900-bb71-4fbc-bb83-17281f2793ca","Type":"ContainerStarted","Data":"c9a2416a1e8e54135808ca15103be7061e6999cf134a95186439751464624496"} Oct 13 18:23:43 crc kubenswrapper[4974]: I1013 18:23:43.346057 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" event={"ID":"a00c1900-bb71-4fbc-bb83-17281f2793ca","Type":"ContainerStarted","Data":"ffb1df9085d2f614479c3c4237e8e277a6e280ed5d84970f7b215b79d34c50fb"} Oct 13 18:23:45 crc kubenswrapper[4974]: I1013 18:23:45.360065 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" event={"ID":"a00c1900-bb71-4fbc-bb83-17281f2793ca","Type":"ContainerStarted","Data":"17ba1a5e7da4a88d09752b10cb35632ef7a8be11fd09341dafce53dc39eed91d"} Oct 13 18:23:45 crc kubenswrapper[4974]: I1013 18:23:45.360747 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:45 crc kubenswrapper[4974]: I1013 18:23:45.360841 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:45 crc kubenswrapper[4974]: I1013 18:23:45.360912 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:45 crc kubenswrapper[4974]: I1013 18:23:45.388848 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:45 crc kubenswrapper[4974]: I1013 18:23:45.389007 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:23:45 crc kubenswrapper[4974]: I1013 18:23:45.397403 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" podStartSLOduration=7.397389959 podStartE2EDuration="7.397389959s" podCreationTimestamp="2025-10-13 18:23:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:23:45.394623118 +0000 UTC m=+560.298989218" watchObservedRunningTime="2025-10-13 18:23:45.397389959 +0000 UTC m=+560.301756039" Oct 13 18:23:51 crc kubenswrapper[4974]: I1013 18:23:51.812255 4974 scope.go:117] "RemoveContainer" containerID="a51eb90e50915e1d2d4940bd5a7e011787bd8e414e7a1abfc7d70efc8c48343d" Oct 13 18:23:51 crc kubenswrapper[4974]: E1013 18:23:51.813545 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xcspx_openshift-multus(9c38c0e3-9bee-402b-adf0-27ac9e31c0f0)\"" pod="openshift-multus/multus-xcspx" podUID="9c38c0e3-9bee-402b-adf0-27ac9e31c0f0" Oct 13 18:24:02 crc kubenswrapper[4974]: I1013 18:24:02.811899 4974 scope.go:117] "RemoveContainer" containerID="a51eb90e50915e1d2d4940bd5a7e011787bd8e414e7a1abfc7d70efc8c48343d" Oct 13 18:24:03 crc kubenswrapper[4974]: I1013 18:24:03.492491 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/2.log" Oct 13 18:24:03 crc kubenswrapper[4974]: I1013 18:24:03.493574 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/1.log" Oct 13 18:24:03 crc kubenswrapper[4974]: I1013 18:24:03.493878 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xcspx" 
event={"ID":"9c38c0e3-9bee-402b-adf0-27ac9e31c0f0","Type":"ContainerStarted","Data":"f7fc45e43d3e8eaa76f7b01f40045c1d7dfcad84f665b30106309caea9dfdef4"} Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.579649 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs"] Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.581742 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.584931 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.625138 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.625235 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.625290 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24vpx\" (UniqueName: 
\"kubernetes.io/projected/9250202b-1093-4e24-a2c7-0c907f458986-kube-api-access-24vpx\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.655611 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs"] Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.726773 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.726884 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.726940 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24vpx\" (UniqueName: \"kubernetes.io/projected/9250202b-1093-4e24-a2c7-0c907f458986-kube-api-access-24vpx\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.727450 4974 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.727623 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.759882 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24vpx\" (UniqueName: \"kubernetes.io/projected/9250202b-1093-4e24-a2c7-0c907f458986-kube-api-access-24vpx\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:06 crc kubenswrapper[4974]: I1013 18:24:06.917633 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:07 crc kubenswrapper[4974]: I1013 18:24:07.170135 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs"] Oct 13 18:24:07 crc kubenswrapper[4974]: I1013 18:24:07.519851 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" event={"ID":"9250202b-1093-4e24-a2c7-0c907f458986","Type":"ContainerStarted","Data":"c35773d0c4ab37053d42bb9b13905241516c5c99f4620a6ae95380eb23358629"} Oct 13 18:24:07 crc kubenswrapper[4974]: I1013 18:24:07.520237 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" event={"ID":"9250202b-1093-4e24-a2c7-0c907f458986","Type":"ContainerStarted","Data":"13c2697be651c03bc02aeb7d55aa6315c989b2a9ca6ddbc86d3f81acbddf6c56"} Oct 13 18:24:07 crc kubenswrapper[4974]: I1013 18:24:07.742565 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:24:07 crc kubenswrapper[4974]: I1013 18:24:07.742627 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:24:08 crc kubenswrapper[4974]: I1013 18:24:08.532920 4974 generic.go:334] "Generic (PLEG): container finished" podID="9250202b-1093-4e24-a2c7-0c907f458986" 
containerID="c35773d0c4ab37053d42bb9b13905241516c5c99f4620a6ae95380eb23358629" exitCode=0 Oct 13 18:24:08 crc kubenswrapper[4974]: I1013 18:24:08.532980 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" event={"ID":"9250202b-1093-4e24-a2c7-0c907f458986","Type":"ContainerDied","Data":"c35773d0c4ab37053d42bb9b13905241516c5c99f4620a6ae95380eb23358629"} Oct 13 18:24:08 crc kubenswrapper[4974]: I1013 18:24:08.719074 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9pvm6" Oct 13 18:24:11 crc kubenswrapper[4974]: I1013 18:24:11.554698 4974 generic.go:334] "Generic (PLEG): container finished" podID="9250202b-1093-4e24-a2c7-0c907f458986" containerID="62d7e5a1a3ccedd765fef794043d044f75b0ce190b59517d184723aa63410009" exitCode=0 Oct 13 18:24:11 crc kubenswrapper[4974]: I1013 18:24:11.554819 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" event={"ID":"9250202b-1093-4e24-a2c7-0c907f458986","Type":"ContainerDied","Data":"62d7e5a1a3ccedd765fef794043d044f75b0ce190b59517d184723aa63410009"} Oct 13 18:24:12 crc kubenswrapper[4974]: I1013 18:24:12.567635 4974 generic.go:334] "Generic (PLEG): container finished" podID="9250202b-1093-4e24-a2c7-0c907f458986" containerID="b1769963fd23978c01a7bf7153d717871a3ee00e73b9c64423c71f71ee7d6fb7" exitCode=0 Oct 13 18:24:12 crc kubenswrapper[4974]: I1013 18:24:12.567792 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" event={"ID":"9250202b-1093-4e24-a2c7-0c907f458986","Type":"ContainerDied","Data":"b1769963fd23978c01a7bf7153d717871a3ee00e73b9c64423c71f71ee7d6fb7"} Oct 13 18:24:13 crc kubenswrapper[4974]: I1013 18:24:13.847260 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.035086 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24vpx\" (UniqueName: \"kubernetes.io/projected/9250202b-1093-4e24-a2c7-0c907f458986-kube-api-access-24vpx\") pod \"9250202b-1093-4e24-a2c7-0c907f458986\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.035422 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-util\") pod \"9250202b-1093-4e24-a2c7-0c907f458986\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.035815 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-bundle\") pod \"9250202b-1093-4e24-a2c7-0c907f458986\" (UID: \"9250202b-1093-4e24-a2c7-0c907f458986\") " Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.039682 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-bundle" (OuterVolumeSpecName: "bundle") pod "9250202b-1093-4e24-a2c7-0c907f458986" (UID: "9250202b-1093-4e24-a2c7-0c907f458986"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.049759 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9250202b-1093-4e24-a2c7-0c907f458986-kube-api-access-24vpx" (OuterVolumeSpecName: "kube-api-access-24vpx") pod "9250202b-1093-4e24-a2c7-0c907f458986" (UID: "9250202b-1093-4e24-a2c7-0c907f458986"). InnerVolumeSpecName "kube-api-access-24vpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.061303 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-util" (OuterVolumeSpecName: "util") pod "9250202b-1093-4e24-a2c7-0c907f458986" (UID: "9250202b-1093-4e24-a2c7-0c907f458986"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.137069 4974 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.137302 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24vpx\" (UniqueName: \"kubernetes.io/projected/9250202b-1093-4e24-a2c7-0c907f458986-kube-api-access-24vpx\") on node \"crc\" DevicePath \"\"" Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.137360 4974 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9250202b-1093-4e24-a2c7-0c907f458986-util\") on node \"crc\" DevicePath \"\"" Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.587538 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" event={"ID":"9250202b-1093-4e24-a2c7-0c907f458986","Type":"ContainerDied","Data":"13c2697be651c03bc02aeb7d55aa6315c989b2a9ca6ddbc86d3f81acbddf6c56"} Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.588085 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13c2697be651c03bc02aeb7d55aa6315c989b2a9ca6ddbc86d3f81acbddf6c56" Oct 13 18:24:14 crc kubenswrapper[4974]: I1013 18:24:14.587695 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs" Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.938634 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj"] Oct 13 18:24:24 crc kubenswrapper[4974]: E1013 18:24:24.939935 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9250202b-1093-4e24-a2c7-0c907f458986" containerName="util" Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.939957 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9250202b-1093-4e24-a2c7-0c907f458986" containerName="util" Oct 13 18:24:24 crc kubenswrapper[4974]: E1013 18:24:24.939982 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9250202b-1093-4e24-a2c7-0c907f458986" containerName="pull" Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.939990 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9250202b-1093-4e24-a2c7-0c907f458986" containerName="pull" Oct 13 18:24:24 crc kubenswrapper[4974]: E1013 18:24:24.940008 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9250202b-1093-4e24-a2c7-0c907f458986" containerName="extract" Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.940016 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9250202b-1093-4e24-a2c7-0c907f458986" containerName="extract" Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.940141 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="9250202b-1093-4e24-a2c7-0c907f458986" containerName="extract" Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.940773 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj" Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.944320 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.944881 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.945369 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-85vpc" Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.961740 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj"] Oct 13 18:24:24 crc kubenswrapper[4974]: I1013 18:24:24.991492 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lcp4\" (UniqueName: \"kubernetes.io/projected/bf1f0f19-a1c6-4d16-8876-a70c018e0452-kube-api-access-8lcp4\") pod \"obo-prometheus-operator-7c8cf85677-qc6nj\" (UID: \"bf1f0f19-a1c6-4d16-8876-a70c018e0452\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.062252 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg"] Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.062894 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.065253 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-nldqn" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.065568 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.078933 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt"] Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.079573 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.093197 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lcp4\" (UniqueName: \"kubernetes.io/projected/bf1f0f19-a1c6-4d16-8876-a70c018e0452-kube-api-access-8lcp4\") pod \"obo-prometheus-operator-7c8cf85677-qc6nj\" (UID: \"bf1f0f19-a1c6-4d16-8876-a70c018e0452\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.093302 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/33d95548-42f2-4bde-88eb-23cfd6a5c5c0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-5hftg\" (UID: \"33d95548-42f2-4bde-88eb-23cfd6a5c5c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.093372 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/33d95548-42f2-4bde-88eb-23cfd6a5c5c0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-5hftg\" (UID: \"33d95548-42f2-4bde-88eb-23cfd6a5c5c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.093401 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cffd65cb-eb29-48d8-b634-4e535b39ce51-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt\" (UID: \"cffd65cb-eb29-48d8-b634-4e535b39ce51\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.093491 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cffd65cb-eb29-48d8-b634-4e535b39ce51-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt\" (UID: \"cffd65cb-eb29-48d8-b634-4e535b39ce51\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.100352 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt"] Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.149342 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lcp4\" (UniqueName: \"kubernetes.io/projected/bf1f0f19-a1c6-4d16-8876-a70c018e0452-kube-api-access-8lcp4\") pod \"obo-prometheus-operator-7c8cf85677-qc6nj\" (UID: \"bf1f0f19-a1c6-4d16-8876-a70c018e0452\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj" Oct 13 18:24:25 crc 
kubenswrapper[4974]: I1013 18:24:25.159889 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg"] Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.193961 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/33d95548-42f2-4bde-88eb-23cfd6a5c5c0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-5hftg\" (UID: \"33d95548-42f2-4bde-88eb-23cfd6a5c5c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.194006 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cffd65cb-eb29-48d8-b634-4e535b39ce51-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt\" (UID: \"cffd65cb-eb29-48d8-b634-4e535b39ce51\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.194039 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cffd65cb-eb29-48d8-b634-4e535b39ce51-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt\" (UID: \"cffd65cb-eb29-48d8-b634-4e535b39ce51\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.194102 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/33d95548-42f2-4bde-88eb-23cfd6a5c5c0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-5hftg\" (UID: \"33d95548-42f2-4bde-88eb-23cfd6a5c5c0\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.199736 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cffd65cb-eb29-48d8-b634-4e535b39ce51-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt\" (UID: \"cffd65cb-eb29-48d8-b634-4e535b39ce51\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.199753 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cffd65cb-eb29-48d8-b634-4e535b39ce51-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt\" (UID: \"cffd65cb-eb29-48d8-b634-4e535b39ce51\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.200920 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/33d95548-42f2-4bde-88eb-23cfd6a5c5c0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-5hftg\" (UID: \"33d95548-42f2-4bde-88eb-23cfd6a5c5c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.201313 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/33d95548-42f2-4bde-88eb-23cfd6a5c5c0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d85487b6-5hftg\" (UID: \"33d95548-42f2-4bde-88eb-23cfd6a5c5c0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.257641 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.291820 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-zqdvp"] Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.292493 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.294672 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.295436 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-669nz" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.341904 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-zqdvp"] Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.376686 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.392969 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.396132 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zmlx\" (UniqueName: \"kubernetes.io/projected/adeddb78-abbb-494b-b723-d3ed7a66503f-kube-api-access-4zmlx\") pod \"observability-operator-cc5f78dfc-zqdvp\" (UID: \"adeddb78-abbb-494b-b723-d3ed7a66503f\") " pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.400779 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/adeddb78-abbb-494b-b723-d3ed7a66503f-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-zqdvp\" (UID: \"adeddb78-abbb-494b-b723-d3ed7a66503f\") " pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.490887 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-8mc8w"] Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.491740 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.497958 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-tr8fr" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.513292 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-8mc8w\" (UID: \"bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471\") " pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.513362 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zmlx\" (UniqueName: \"kubernetes.io/projected/adeddb78-abbb-494b-b723-d3ed7a66503f-kube-api-access-4zmlx\") pod \"observability-operator-cc5f78dfc-zqdvp\" (UID: \"adeddb78-abbb-494b-b723-d3ed7a66503f\") " pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.513390 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtjdd\" (UniqueName: \"kubernetes.io/projected/bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471-kube-api-access-dtjdd\") pod \"perses-operator-54bc95c9fb-8mc8w\" (UID: \"bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471\") " pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.513426 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/adeddb78-abbb-494b-b723-d3ed7a66503f-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-zqdvp\" (UID: \"adeddb78-abbb-494b-b723-d3ed7a66503f\") " 
pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.544768 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/adeddb78-abbb-494b-b723-d3ed7a66503f-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-zqdvp\" (UID: \"adeddb78-abbb-494b-b723-d3ed7a66503f\") " pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.557346 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-8mc8w"] Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.572657 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zmlx\" (UniqueName: \"kubernetes.io/projected/adeddb78-abbb-494b-b723-d3ed7a66503f-kube-api-access-4zmlx\") pod \"observability-operator-cc5f78dfc-zqdvp\" (UID: \"adeddb78-abbb-494b-b723-d3ed7a66503f\") " pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.619663 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-8mc8w\" (UID: \"bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471\") " pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.619722 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtjdd\" (UniqueName: \"kubernetes.io/projected/bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471-kube-api-access-dtjdd\") pod \"perses-operator-54bc95c9fb-8mc8w\" (UID: \"bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471\") " pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 
18:24:25.620891 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-8mc8w\" (UID: \"bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471\") " pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.621163 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.650717 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtjdd\" (UniqueName: \"kubernetes.io/projected/bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471-kube-api-access-dtjdd\") pod \"perses-operator-54bc95c9fb-8mc8w\" (UID: \"bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471\") " pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.746515 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg"] Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.770202 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj"] Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.870097 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-tr8fr" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.875393 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" Oct 13 18:24:25 crc kubenswrapper[4974]: I1013 18:24:25.907557 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt"] Oct 13 18:24:25 crc kubenswrapper[4974]: W1013 18:24:25.919815 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcffd65cb_eb29_48d8_b634_4e535b39ce51.slice/crio-942f3709197c969454e7d5fee76bcb000f30926d78256ed3f74f9e12661ed839 WatchSource:0}: Error finding container 942f3709197c969454e7d5fee76bcb000f30926d78256ed3f74f9e12661ed839: Status 404 returned error can't find the container with id 942f3709197c969454e7d5fee76bcb000f30926d78256ed3f74f9e12661ed839 Oct 13 18:24:26 crc kubenswrapper[4974]: I1013 18:24:26.001502 4974 scope.go:117] "RemoveContainer" containerID="e9fee19196e60339da8bcf866619d7ffb5a207e7b5ea33c687d1383cf62f7d94" Oct 13 18:24:26 crc kubenswrapper[4974]: I1013 18:24:26.151919 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-zqdvp"] Oct 13 18:24:26 crc kubenswrapper[4974]: I1013 18:24:26.486414 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-8mc8w"] Oct 13 18:24:26 crc kubenswrapper[4974]: W1013 18:24:26.496044 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcccc6ab_47aa_4c17_88e0_fb0ed3ea5471.slice/crio-6d8d5936ed8836db0bfe4351b90a8b8567b64a57fb55226136fb7780f35b1565 WatchSource:0}: Error finding container 6d8d5936ed8836db0bfe4351b90a8b8567b64a57fb55226136fb7780f35b1565: Status 404 returned error can't find the container with id 6d8d5936ed8836db0bfe4351b90a8b8567b64a57fb55226136fb7780f35b1565 Oct 13 18:24:26 crc kubenswrapper[4974]: I1013 18:24:26.670370 4974 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" event={"ID":"cffd65cb-eb29-48d8-b634-4e535b39ce51","Type":"ContainerStarted","Data":"942f3709197c969454e7d5fee76bcb000f30926d78256ed3f74f9e12661ed839"} Oct 13 18:24:26 crc kubenswrapper[4974]: I1013 18:24:26.671682 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" event={"ID":"33d95548-42f2-4bde-88eb-23cfd6a5c5c0","Type":"ContainerStarted","Data":"8a8dfe072ad26f32c2e47c7e98eb61206057caf3956984e3b70a4956da011b97"} Oct 13 18:24:26 crc kubenswrapper[4974]: I1013 18:24:26.672655 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj" event={"ID":"bf1f0f19-a1c6-4d16-8876-a70c018e0452","Type":"ContainerStarted","Data":"12f79aaabdf3bdbc3733b48b6a5c40699422a75eeaae7d68ef40c43949165cdb"} Oct 13 18:24:26 crc kubenswrapper[4974]: I1013 18:24:26.674438 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xcspx_9c38c0e3-9bee-402b-adf0-27ac9e31c0f0/kube-multus/2.log" Oct 13 18:24:26 crc kubenswrapper[4974]: I1013 18:24:26.675624 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" event={"ID":"adeddb78-abbb-494b-b723-d3ed7a66503f","Type":"ContainerStarted","Data":"47aae608bae1416d51d2b7e075ef8b43162f0d2c94f5584f445625cfc60b0f47"} Oct 13 18:24:26 crc kubenswrapper[4974]: I1013 18:24:26.676736 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" event={"ID":"bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471","Type":"ContainerStarted","Data":"6d8d5936ed8836db0bfe4351b90a8b8567b64a57fb55226136fb7780f35b1565"} Oct 13 18:24:37 crc kubenswrapper[4974]: I1013 18:24:37.743321 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:24:37 crc kubenswrapper[4974]: I1013 18:24:37.743976 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.753243 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" event={"ID":"cffd65cb-eb29-48d8-b634-4e535b39ce51","Type":"ContainerStarted","Data":"0b1cd0f73f658c8278414b548a3e71570643363e57f302b0cf745d3d7e8ea01d"} Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.756134 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" event={"ID":"33d95548-42f2-4bde-88eb-23cfd6a5c5c0","Type":"ContainerStarted","Data":"c7b9344988256247e0f9d5b99ca5aed3f809bafa022ca824ec71af1060da7ce8"} Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.760378 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" event={"ID":"adeddb78-abbb-494b-b723-d3ed7a66503f","Type":"ContainerStarted","Data":"9c68f1f32792c2749895ded4cc4f3072d72a4834ade4c01d0bc6d6e8b406c956"} Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.760952 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.762278 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" event={"ID":"bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471","Type":"ContainerStarted","Data":"390d54ad5688f1819a6b5fe9644e5dc262bde3645ef5c62f883015bb7286b47a"} Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.762892 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.763935 4974 patch_prober.go:28] interesting pod/observability-operator-cc5f78dfc-zqdvp container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.44:8081/healthz\": dial tcp 10.217.0.44:8081: connect: connection refused" start-of-body= Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.764049 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" podUID="adeddb78-abbb-494b-b723-d3ed7a66503f" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.44:8081/healthz\": dial tcp 10.217.0.44:8081: connect: connection refused" Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.774343 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt" podStartSLOduration=1.3314556309999999 podStartE2EDuration="13.774313336s" podCreationTimestamp="2025-10-13 18:24:25 +0000 UTC" firstStartedPulling="2025-10-13 18:24:25.923927405 +0000 UTC m=+600.828293485" lastFinishedPulling="2025-10-13 18:24:38.3667851 +0000 UTC m=+613.271151190" observedRunningTime="2025-10-13 18:24:38.772564286 +0000 UTC m=+613.676930406" watchObservedRunningTime="2025-10-13 18:24:38.774313336 +0000 UTC m=+613.678679416" Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.835522 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" 
podStartSLOduration=1.557723932 podStartE2EDuration="13.835496249s" podCreationTimestamp="2025-10-13 18:24:25 +0000 UTC" firstStartedPulling="2025-10-13 18:24:26.1711278 +0000 UTC m=+601.075493880" lastFinishedPulling="2025-10-13 18:24:38.448900107 +0000 UTC m=+613.353266197" observedRunningTime="2025-10-13 18:24:38.834071668 +0000 UTC m=+613.738437768" watchObservedRunningTime="2025-10-13 18:24:38.835496249 +0000 UTC m=+613.739862319" Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.836028 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" podStartSLOduration=1.919604293 podStartE2EDuration="13.836019045s" podCreationTimestamp="2025-10-13 18:24:25 +0000 UTC" firstStartedPulling="2025-10-13 18:24:26.49831235 +0000 UTC m=+601.402678430" lastFinishedPulling="2025-10-13 18:24:38.414727102 +0000 UTC m=+613.319093182" observedRunningTime="2025-10-13 18:24:38.801398997 +0000 UTC m=+613.705765087" watchObservedRunningTime="2025-10-13 18:24:38.836019045 +0000 UTC m=+613.740385125" Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.868221 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d85487b6-5hftg" podStartSLOduration=1.271261937 podStartE2EDuration="13.868199052s" podCreationTimestamp="2025-10-13 18:24:25 +0000 UTC" firstStartedPulling="2025-10-13 18:24:25.791311644 +0000 UTC m=+600.695677724" lastFinishedPulling="2025-10-13 18:24:38.388248759 +0000 UTC m=+613.292614839" observedRunningTime="2025-10-13 18:24:38.867203553 +0000 UTC m=+613.771569663" watchObservedRunningTime="2025-10-13 18:24:38.868199052 +0000 UTC m=+613.772565142" Oct 13 18:24:38 crc kubenswrapper[4974]: I1013 18:24:38.892852 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj" podStartSLOduration=2.315271095 
podStartE2EDuration="14.892830872s" podCreationTimestamp="2025-10-13 18:24:24 +0000 UTC" firstStartedPulling="2025-10-13 18:24:25.83733673 +0000 UTC m=+600.741702810" lastFinishedPulling="2025-10-13 18:24:38.414896507 +0000 UTC m=+613.319262587" observedRunningTime="2025-10-13 18:24:38.891285907 +0000 UTC m=+613.795651987" watchObservedRunningTime="2025-10-13 18:24:38.892830872 +0000 UTC m=+613.797196942" Oct 13 18:24:39 crc kubenswrapper[4974]: I1013 18:24:39.769813 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-qc6nj" event={"ID":"bf1f0f19-a1c6-4d16-8876-a70c018e0452","Type":"ContainerStarted","Data":"80d31fd1b305a9723051d668fe6dc79d18482a484b278b1894c38009041f276a"} Oct 13 18:24:39 crc kubenswrapper[4974]: I1013 18:24:39.771898 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-zqdvp" Oct 13 18:24:45 crc kubenswrapper[4974]: I1013 18:24:45.878638 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-8mc8w" Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.776889 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt"] Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.779897 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.783125 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.789644 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt"] Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.847166 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67xfp\" (UniqueName: \"kubernetes.io/projected/40c8dc6f-22e8-44bb-818f-1861ac1566cf-kube-api-access-67xfp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.847306 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.847358 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:03 crc kubenswrapper[4974]: 
I1013 18:25:03.948866 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67xfp\" (UniqueName: \"kubernetes.io/projected/40c8dc6f-22e8-44bb-818f-1861ac1566cf-kube-api-access-67xfp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.948945 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.948972 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.949596 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-util\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.950153 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-bundle\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:03 crc kubenswrapper[4974]: I1013 18:25:03.980376 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67xfp\" (UniqueName: \"kubernetes.io/projected/40c8dc6f-22e8-44bb-818f-1861ac1566cf-kube-api-access-67xfp\") pod \"fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:04 crc kubenswrapper[4974]: I1013 18:25:04.104533 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:04 crc kubenswrapper[4974]: I1013 18:25:04.353985 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt"] Oct 13 18:25:04 crc kubenswrapper[4974]: W1013 18:25:04.360100 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40c8dc6f_22e8_44bb_818f_1861ac1566cf.slice/crio-b6a7587fa29934d7763808b28d4aa1c2f74966145dbbd179ae91849422332c28 WatchSource:0}: Error finding container b6a7587fa29934d7763808b28d4aa1c2f74966145dbbd179ae91849422332c28: Status 404 returned error can't find the container with id b6a7587fa29934d7763808b28d4aa1c2f74966145dbbd179ae91849422332c28 Oct 13 18:25:04 crc kubenswrapper[4974]: I1013 18:25:04.930467 4974 generic.go:334] "Generic (PLEG): container finished" podID="40c8dc6f-22e8-44bb-818f-1861ac1566cf" containerID="99850d0d8a7b8872ac185620bcb0df8e9cf380cdca0dc647595e4b3dd5f77bde" 
exitCode=0 Oct 13 18:25:04 crc kubenswrapper[4974]: I1013 18:25:04.930614 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" event={"ID":"40c8dc6f-22e8-44bb-818f-1861ac1566cf","Type":"ContainerDied","Data":"99850d0d8a7b8872ac185620bcb0df8e9cf380cdca0dc647595e4b3dd5f77bde"} Oct 13 18:25:04 crc kubenswrapper[4974]: I1013 18:25:04.930869 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" event={"ID":"40c8dc6f-22e8-44bb-818f-1861ac1566cf","Type":"ContainerStarted","Data":"b6a7587fa29934d7763808b28d4aa1c2f74966145dbbd179ae91849422332c28"} Oct 13 18:25:07 crc kubenswrapper[4974]: I1013 18:25:07.743254 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:25:07 crc kubenswrapper[4974]: I1013 18:25:07.743694 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:25:07 crc kubenswrapper[4974]: I1013 18:25:07.743765 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:25:07 crc kubenswrapper[4974]: I1013 18:25:07.744610 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8894b6a4641af63269e4147cdef54f2c3c1eada1368d28e6c504cfd79085b430"} 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 18:25:07 crc kubenswrapper[4974]: I1013 18:25:07.744749 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://8894b6a4641af63269e4147cdef54f2c3c1eada1368d28e6c504cfd79085b430" gracePeriod=600 Oct 13 18:25:09 crc kubenswrapper[4974]: I1013 18:25:09.966105 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="8894b6a4641af63269e4147cdef54f2c3c1eada1368d28e6c504cfd79085b430" exitCode=0 Oct 13 18:25:09 crc kubenswrapper[4974]: I1013 18:25:09.966179 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"8894b6a4641af63269e4147cdef54f2c3c1eada1368d28e6c504cfd79085b430"} Oct 13 18:25:09 crc kubenswrapper[4974]: I1013 18:25:09.966513 4974 scope.go:117] "RemoveContainer" containerID="29dc800fef44c866f1bfd3f3e803022d871dfc0b9f20d8a971dfc51b409fc8b0" Oct 13 18:25:10 crc kubenswrapper[4974]: I1013 18:25:10.978489 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"436597ac77fb62acd2a6755e030d52c331e47424c1685e9a911b5a1473f796ca"} Oct 13 18:25:10 crc kubenswrapper[4974]: I1013 18:25:10.983557 4974 generic.go:334] "Generic (PLEG): container finished" podID="40c8dc6f-22e8-44bb-818f-1861ac1566cf" containerID="2b464f48d715b327dbb5695abbe6129b9ff5a4694bd38690d83a0a62c6747974" exitCode=0 Oct 13 18:25:10 crc kubenswrapper[4974]: I1013 18:25:10.983595 4974 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" event={"ID":"40c8dc6f-22e8-44bb-818f-1861ac1566cf","Type":"ContainerDied","Data":"2b464f48d715b327dbb5695abbe6129b9ff5a4694bd38690d83a0a62c6747974"} Oct 13 18:25:11 crc kubenswrapper[4974]: I1013 18:25:11.994248 4974 generic.go:334] "Generic (PLEG): container finished" podID="40c8dc6f-22e8-44bb-818f-1861ac1566cf" containerID="0a56d9ef510168b766f429a7fa0c3849d8d730e40ac56230925fe38a38b000c0" exitCode=0 Oct 13 18:25:11 crc kubenswrapper[4974]: I1013 18:25:11.994300 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" event={"ID":"40c8dc6f-22e8-44bb-818f-1861ac1566cf","Type":"ContainerDied","Data":"0a56d9ef510168b766f429a7fa0c3849d8d730e40ac56230925fe38a38b000c0"} Oct 13 18:25:13 crc kubenswrapper[4974]: I1013 18:25:13.390738 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:13 crc kubenswrapper[4974]: I1013 18:25:13.479724 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-bundle\") pod \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " Oct 13 18:25:13 crc kubenswrapper[4974]: I1013 18:25:13.479827 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-util\") pod \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " Oct 13 18:25:13 crc kubenswrapper[4974]: I1013 18:25:13.479949 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67xfp\" (UniqueName: \"kubernetes.io/projected/40c8dc6f-22e8-44bb-818f-1861ac1566cf-kube-api-access-67xfp\") pod \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\" (UID: \"40c8dc6f-22e8-44bb-818f-1861ac1566cf\") " Oct 13 18:25:13 crc kubenswrapper[4974]: I1013 18:25:13.481792 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-bundle" (OuterVolumeSpecName: "bundle") pod "40c8dc6f-22e8-44bb-818f-1861ac1566cf" (UID: "40c8dc6f-22e8-44bb-818f-1861ac1566cf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:25:13 crc kubenswrapper[4974]: I1013 18:25:13.487391 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40c8dc6f-22e8-44bb-818f-1861ac1566cf-kube-api-access-67xfp" (OuterVolumeSpecName: "kube-api-access-67xfp") pod "40c8dc6f-22e8-44bb-818f-1861ac1566cf" (UID: "40c8dc6f-22e8-44bb-818f-1861ac1566cf"). InnerVolumeSpecName "kube-api-access-67xfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:25:13 crc kubenswrapper[4974]: I1013 18:25:13.503590 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-util" (OuterVolumeSpecName: "util") pod "40c8dc6f-22e8-44bb-818f-1861ac1566cf" (UID: "40c8dc6f-22e8-44bb-818f-1861ac1566cf"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:25:13 crc kubenswrapper[4974]: I1013 18:25:13.582059 4974 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:25:13 crc kubenswrapper[4974]: I1013 18:25:13.582111 4974 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/40c8dc6f-22e8-44bb-818f-1861ac1566cf-util\") on node \"crc\" DevicePath \"\"" Oct 13 18:25:13 crc kubenswrapper[4974]: I1013 18:25:13.582131 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67xfp\" (UniqueName: \"kubernetes.io/projected/40c8dc6f-22e8-44bb-818f-1861ac1566cf-kube-api-access-67xfp\") on node \"crc\" DevicePath \"\"" Oct 13 18:25:14 crc kubenswrapper[4974]: I1013 18:25:14.011868 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" event={"ID":"40c8dc6f-22e8-44bb-818f-1861ac1566cf","Type":"ContainerDied","Data":"b6a7587fa29934d7763808b28d4aa1c2f74966145dbbd179ae91849422332c28"} Oct 13 18:25:14 crc kubenswrapper[4974]: I1013 18:25:14.011912 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6a7587fa29934d7763808b28d4aa1c2f74966145dbbd179ae91849422332c28" Oct 13 18:25:14 crc kubenswrapper[4974]: I1013 18:25:14.011995 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.455054 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5cgww"] Oct 13 18:25:20 crc kubenswrapper[4974]: E1013 18:25:20.456110 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c8dc6f-22e8-44bb-818f-1861ac1566cf" containerName="extract" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.456141 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c8dc6f-22e8-44bb-818f-1861ac1566cf" containerName="extract" Oct 13 18:25:20 crc kubenswrapper[4974]: E1013 18:25:20.456180 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c8dc6f-22e8-44bb-818f-1861ac1566cf" containerName="pull" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.456198 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c8dc6f-22e8-44bb-818f-1861ac1566cf" containerName="pull" Oct 13 18:25:20 crc kubenswrapper[4974]: E1013 18:25:20.456225 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40c8dc6f-22e8-44bb-818f-1861ac1566cf" containerName="util" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.456240 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="40c8dc6f-22e8-44bb-818f-1861ac1566cf" containerName="util" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.456483 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="40c8dc6f-22e8-44bb-818f-1861ac1566cf" containerName="extract" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.457202 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5cgww" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.459067 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-xzbdg" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.459473 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.460441 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.468221 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5cgww"] Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.566893 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq8mx\" (UniqueName: \"kubernetes.io/projected/b8f48940-751e-4e3c-98a3-d29c1b73e776-kube-api-access-rq8mx\") pod \"nmstate-operator-858ddd8f98-5cgww\" (UID: \"b8f48940-751e-4e3c-98a3-d29c1b73e776\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5cgww" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.667898 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rq8mx\" (UniqueName: \"kubernetes.io/projected/b8f48940-751e-4e3c-98a3-d29c1b73e776-kube-api-access-rq8mx\") pod \"nmstate-operator-858ddd8f98-5cgww\" (UID: \"b8f48940-751e-4e3c-98a3-d29c1b73e776\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5cgww" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.685647 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq8mx\" (UniqueName: \"kubernetes.io/projected/b8f48940-751e-4e3c-98a3-d29c1b73e776-kube-api-access-rq8mx\") pod \"nmstate-operator-858ddd8f98-5cgww\" (UID: 
\"b8f48940-751e-4e3c-98a3-d29c1b73e776\") " pod="openshift-nmstate/nmstate-operator-858ddd8f98-5cgww" Oct 13 18:25:20 crc kubenswrapper[4974]: I1013 18:25:20.776921 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5cgww" Oct 13 18:25:21 crc kubenswrapper[4974]: I1013 18:25:21.031136 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-858ddd8f98-5cgww"] Oct 13 18:25:21 crc kubenswrapper[4974]: I1013 18:25:21.062392 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5cgww" event={"ID":"b8f48940-751e-4e3c-98a3-d29c1b73e776","Type":"ContainerStarted","Data":"c0eb5bcb779e592c8f4fb0f3ea7ea08f33a461319a0af2e5d4e555e91fe28bab"} Oct 13 18:25:24 crc kubenswrapper[4974]: I1013 18:25:24.083475 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5cgww" event={"ID":"b8f48940-751e-4e3c-98a3-d29c1b73e776","Type":"ContainerStarted","Data":"22fef31cd65693a528983a4f7e63205fde171563a29e6d67cba9639ec59a7cf1"} Oct 13 18:25:24 crc kubenswrapper[4974]: I1013 18:25:24.114256 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-858ddd8f98-5cgww" podStartSLOduration=1.9613649720000002 podStartE2EDuration="4.114220841s" podCreationTimestamp="2025-10-13 18:25:20 +0000 UTC" firstStartedPulling="2025-10-13 18:25:21.049082348 +0000 UTC m=+655.953448428" lastFinishedPulling="2025-10-13 18:25:23.201938217 +0000 UTC m=+658.106304297" observedRunningTime="2025-10-13 18:25:24.107797206 +0000 UTC m=+659.012163346" watchObservedRunningTime="2025-10-13 18:25:24.114220841 +0000 UTC m=+659.018586951" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.071533 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 
18:25:25.088107 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.092180 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-klrqq" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.099087 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.099849 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.103269 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.121106 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.125452 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.128162 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ded07897-9b9c-4548-a909-02c623167912-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-rkbf2\" (UID: \"ded07897-9b9c-4548-a909-02c623167912\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.128202 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qckd6\" (UniqueName: \"kubernetes.io/projected/16f35fcb-c349-4306-a4ee-306dfff9a8f1-kube-api-access-qckd6\") pod \"nmstate-metrics-fdff9cb8d-zg86t\" (UID: 
\"16f35fcb-c349-4306-a4ee-306dfff9a8f1\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.128241 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc5t4\" (UniqueName: \"kubernetes.io/projected/ded07897-9b9c-4548-a909-02c623167912-kube-api-access-mc5t4\") pod \"nmstate-webhook-6cdbc54649-rkbf2\" (UID: \"ded07897-9b9c-4548-a909-02c623167912\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.137312 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-skb5t"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.137984 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.223055 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.223768 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.225949 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.226082 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-cwpzw" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.226120 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.229203 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ded07897-9b9c-4548-a909-02c623167912-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-rkbf2\" (UID: \"ded07897-9b9c-4548-a909-02c623167912\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.229241 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qckd6\" (UniqueName: \"kubernetes.io/projected/16f35fcb-c349-4306-a4ee-306dfff9a8f1-kube-api-access-qckd6\") pod \"nmstate-metrics-fdff9cb8d-zg86t\" (UID: \"16f35fcb-c349-4306-a4ee-306dfff9a8f1\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.229280 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4ktk\" (UniqueName: \"kubernetes.io/projected/51e7a340-7c32-4fae-b22f-2dd321f0afc1-kube-api-access-d4ktk\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.229304 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mc5t4\" (UniqueName: \"kubernetes.io/projected/ded07897-9b9c-4548-a909-02c623167912-kube-api-access-mc5t4\") pod \"nmstate-webhook-6cdbc54649-rkbf2\" (UID: \"ded07897-9b9c-4548-a909-02c623167912\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.229319 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/51e7a340-7c32-4fae-b22f-2dd321f0afc1-dbus-socket\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.229337 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/51e7a340-7c32-4fae-b22f-2dd321f0afc1-nmstate-lock\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.229366 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/51e7a340-7c32-4fae-b22f-2dd321f0afc1-ovs-socket\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: E1013 18:25:25.229633 4974 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Oct 13 18:25:25 crc kubenswrapper[4974]: E1013 18:25:25.229686 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ded07897-9b9c-4548-a909-02c623167912-tls-key-pair podName:ded07897-9b9c-4548-a909-02c623167912 nodeName:}" failed. 
No retries permitted until 2025-10-13 18:25:25.72967111 +0000 UTC m=+660.634037190 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/ded07897-9b9c-4548-a909-02c623167912-tls-key-pair") pod "nmstate-webhook-6cdbc54649-rkbf2" (UID: "ded07897-9b9c-4548-a909-02c623167912") : secret "openshift-nmstate-webhook" not found Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.234503 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.247546 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc5t4\" (UniqueName: \"kubernetes.io/projected/ded07897-9b9c-4548-a909-02c623167912-kube-api-access-mc5t4\") pod \"nmstate-webhook-6cdbc54649-rkbf2\" (UID: \"ded07897-9b9c-4548-a909-02c623167912\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.255816 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qckd6\" (UniqueName: \"kubernetes.io/projected/16f35fcb-c349-4306-a4ee-306dfff9a8f1-kube-api-access-qckd6\") pod \"nmstate-metrics-fdff9cb8d-zg86t\" (UID: \"16f35fcb-c349-4306-a4ee-306dfff9a8f1\") " pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.330809 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/51e7a340-7c32-4fae-b22f-2dd321f0afc1-dbus-socket\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.331185 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/51e7a340-7c32-4fae-b22f-2dd321f0afc1-nmstate-lock\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.331218 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/51e7a340-7c32-4fae-b22f-2dd321f0afc1-ovs-socket\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.331268 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4d9c8026-13ca-4df7-8bfc-d36594573e26-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-rx8hz\" (UID: \"4d9c8026-13ca-4df7-8bfc-d36594573e26\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.331293 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhvd6\" (UniqueName: \"kubernetes.io/projected/4d9c8026-13ca-4df7-8bfc-d36594573e26-kube-api-access-bhvd6\") pod \"nmstate-console-plugin-6b874cbd85-rx8hz\" (UID: \"4d9c8026-13ca-4df7-8bfc-d36594573e26\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.331334 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d9c8026-13ca-4df7-8bfc-d36594573e26-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-rx8hz\" (UID: \"4d9c8026-13ca-4df7-8bfc-d36594573e26\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.331397 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4ktk\" (UniqueName: \"kubernetes.io/projected/51e7a340-7c32-4fae-b22f-2dd321f0afc1-kube-api-access-d4ktk\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.331534 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/51e7a340-7c32-4fae-b22f-2dd321f0afc1-ovs-socket\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.331622 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/51e7a340-7c32-4fae-b22f-2dd321f0afc1-nmstate-lock\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.331849 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/51e7a340-7c32-4fae-b22f-2dd321f0afc1-dbus-socket\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.352142 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4ktk\" (UniqueName: \"kubernetes.io/projected/51e7a340-7c32-4fae-b22f-2dd321f0afc1-kube-api-access-d4ktk\") pod \"nmstate-handler-skb5t\" (UID: \"51e7a340-7c32-4fae-b22f-2dd321f0afc1\") " pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.415927 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.432159 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4d9c8026-13ca-4df7-8bfc-d36594573e26-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-rx8hz\" (UID: \"4d9c8026-13ca-4df7-8bfc-d36594573e26\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.432216 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhvd6\" (UniqueName: \"kubernetes.io/projected/4d9c8026-13ca-4df7-8bfc-d36594573e26-kube-api-access-bhvd6\") pod \"nmstate-console-plugin-6b874cbd85-rx8hz\" (UID: \"4d9c8026-13ca-4df7-8bfc-d36594573e26\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.432243 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d9c8026-13ca-4df7-8bfc-d36594573e26-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-rx8hz\" (UID: \"4d9c8026-13ca-4df7-8bfc-d36594573e26\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.433639 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/4d9c8026-13ca-4df7-8bfc-d36594573e26-nginx-conf\") pod \"nmstate-console-plugin-6b874cbd85-rx8hz\" (UID: \"4d9c8026-13ca-4df7-8bfc-d36594573e26\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.437904 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6fbd896c7d-9lqdg"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.438582 4974 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.443314 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/4d9c8026-13ca-4df7-8bfc-d36594573e26-plugin-serving-cert\") pod \"nmstate-console-plugin-6b874cbd85-rx8hz\" (UID: \"4d9c8026-13ca-4df7-8bfc-d36594573e26\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.453264 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhvd6\" (UniqueName: \"kubernetes.io/projected/4d9c8026-13ca-4df7-8bfc-d36594573e26-kube-api-access-bhvd6\") pod \"nmstate-console-plugin-6b874cbd85-rx8hz\" (UID: \"4d9c8026-13ca-4df7-8bfc-d36594573e26\") " pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.456198 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:25 crc kubenswrapper[4974]: W1013 18:25:25.475465 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e7a340_7c32_4fae_b22f_2dd321f0afc1.slice/crio-56907f9697da11e3634e70dd5ecc7821b86c8aebeb82e266b004986ff7f93f64 WatchSource:0}: Error finding container 56907f9697da11e3634e70dd5ecc7821b86c8aebeb82e266b004986ff7f93f64: Status 404 returned error can't find the container with id 56907f9697da11e3634e70dd5ecc7821b86c8aebeb82e266b004986ff7f93f64 Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.515028 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fbd896c7d-9lqdg"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.533624 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-service-ca\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.533739 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6zcl\" (UniqueName: \"kubernetes.io/projected/42748cf8-71e7-43df-a63d-eeebba063461-kube-api-access-f6zcl\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.533788 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-trusted-ca-bundle\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " 
pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.533839 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42748cf8-71e7-43df-a63d-eeebba063461-console-serving-cert\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.533897 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-console-config\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.533980 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-oauth-serving-cert\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.534057 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42748cf8-71e7-43df-a63d-eeebba063461-console-oauth-config\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.543814 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.636365 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42748cf8-71e7-43df-a63d-eeebba063461-console-serving-cert\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.636435 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-console-config\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.636475 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-oauth-serving-cert\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.636520 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42748cf8-71e7-43df-a63d-eeebba063461-console-oauth-config\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.636541 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-service-ca\") pod \"console-6fbd896c7d-9lqdg\" (UID: 
\"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.636560 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6zcl\" (UniqueName: \"kubernetes.io/projected/42748cf8-71e7-43df-a63d-eeebba063461-kube-api-access-f6zcl\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.636582 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-trusted-ca-bundle\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.637958 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-oauth-serving-cert\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.637961 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-console-config\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.639190 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-trusted-ca-bundle\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " 
pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.639822 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/42748cf8-71e7-43df-a63d-eeebba063461-service-ca\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.645158 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/42748cf8-71e7-43df-a63d-eeebba063461-console-oauth-config\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.647814 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/42748cf8-71e7-43df-a63d-eeebba063461-console-serving-cert\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.652344 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6zcl\" (UniqueName: \"kubernetes.io/projected/42748cf8-71e7-43df-a63d-eeebba063461-kube-api-access-f6zcl\") pod \"console-6fbd896c7d-9lqdg\" (UID: \"42748cf8-71e7-43df-a63d-eeebba063461\") " pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.674035 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.737470 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/ded07897-9b9c-4548-a909-02c623167912-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-rkbf2\" (UID: \"ded07897-9b9c-4548-a909-02c623167912\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.738438 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz"] Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.741369 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ded07897-9b9c-4548-a909-02c623167912-tls-key-pair\") pod \"nmstate-webhook-6cdbc54649-rkbf2\" (UID: \"ded07897-9b9c-4548-a909-02c623167912\") " pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:25:25 crc kubenswrapper[4974]: W1013 18:25:25.745374 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d9c8026_13ca_4df7_8bfc_d36594573e26.slice/crio-8abcdc8c9f43fc6942156f3a8482f5477d6dd2cceeac523fbe7be3fffc58a2d7 WatchSource:0}: Error finding container 8abcdc8c9f43fc6942156f3a8482f5477d6dd2cceeac523fbe7be3fffc58a2d7: Status 404 returned error can't find the container with id 8abcdc8c9f43fc6942156f3a8482f5477d6dd2cceeac523fbe7be3fffc58a2d7 Oct 13 18:25:25 crc kubenswrapper[4974]: I1013 18:25:25.833515 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:26 crc kubenswrapper[4974]: I1013 18:25:26.026420 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:25:26 crc kubenswrapper[4974]: I1013 18:25:26.043959 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6fbd896c7d-9lqdg"] Oct 13 18:25:26 crc kubenswrapper[4974]: W1013 18:25:26.053520 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42748cf8_71e7_43df_a63d_eeebba063461.slice/crio-f607f3b357050bba1d2a127fda6b61be3a18c4c685e9b0b1b3a3771c28753d61 WatchSource:0}: Error finding container f607f3b357050bba1d2a127fda6b61be3a18c4c685e9b0b1b3a3771c28753d61: Status 404 returned error can't find the container with id f607f3b357050bba1d2a127fda6b61be3a18c4c685e9b0b1b3a3771c28753d61 Oct 13 18:25:26 crc kubenswrapper[4974]: I1013 18:25:26.113376 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-skb5t" event={"ID":"51e7a340-7c32-4fae-b22f-2dd321f0afc1","Type":"ContainerStarted","Data":"56907f9697da11e3634e70dd5ecc7821b86c8aebeb82e266b004986ff7f93f64"} Oct 13 18:25:26 crc kubenswrapper[4974]: I1013 18:25:26.114453 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t" event={"ID":"16f35fcb-c349-4306-a4ee-306dfff9a8f1","Type":"ContainerStarted","Data":"8ea645f491c4c7b20425a71b7c763bbc7798ab9b45e434f4f61a096f200a1363"} Oct 13 18:25:26 crc kubenswrapper[4974]: I1013 18:25:26.117200 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" event={"ID":"4d9c8026-13ca-4df7-8bfc-d36594573e26","Type":"ContainerStarted","Data":"8abcdc8c9f43fc6942156f3a8482f5477d6dd2cceeac523fbe7be3fffc58a2d7"} Oct 13 18:25:26 crc kubenswrapper[4974]: I1013 18:25:26.129926 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fbd896c7d-9lqdg" 
event={"ID":"42748cf8-71e7-43df-a63d-eeebba063461","Type":"ContainerStarted","Data":"f607f3b357050bba1d2a127fda6b61be3a18c4c685e9b0b1b3a3771c28753d61"} Oct 13 18:25:26 crc kubenswrapper[4974]: I1013 18:25:26.240218 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2"] Oct 13 18:25:26 crc kubenswrapper[4974]: W1013 18:25:26.256363 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podded07897_9b9c_4548_a909_02c623167912.slice/crio-d86b200c11f5689b08f91f096bcb8437dc287aa326933b7a70fdb81f8ac3e711 WatchSource:0}: Error finding container d86b200c11f5689b08f91f096bcb8437dc287aa326933b7a70fdb81f8ac3e711: Status 404 returned error can't find the container with id d86b200c11f5689b08f91f096bcb8437dc287aa326933b7a70fdb81f8ac3e711 Oct 13 18:25:27 crc kubenswrapper[4974]: I1013 18:25:27.136292 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6fbd896c7d-9lqdg" event={"ID":"42748cf8-71e7-43df-a63d-eeebba063461","Type":"ContainerStarted","Data":"d70c1a519cef7d064befc57149d9eb74f207511f48f458f71c8f6eed8ac4bb40"} Oct 13 18:25:27 crc kubenswrapper[4974]: I1013 18:25:27.137421 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" event={"ID":"ded07897-9b9c-4548-a909-02c623167912","Type":"ContainerStarted","Data":"d86b200c11f5689b08f91f096bcb8437dc287aa326933b7a70fdb81f8ac3e711"} Oct 13 18:25:27 crc kubenswrapper[4974]: I1013 18:25:27.151495 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6fbd896c7d-9lqdg" podStartSLOduration=2.15147912 podStartE2EDuration="2.15147912s" podCreationTimestamp="2025-10-13 18:25:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:25:27.150136421 +0000 UTC m=+662.054502521" 
watchObservedRunningTime="2025-10-13 18:25:27.15147912 +0000 UTC m=+662.055845200" Oct 13 18:25:29 crc kubenswrapper[4974]: I1013 18:25:29.151028 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-skb5t" event={"ID":"51e7a340-7c32-4fae-b22f-2dd321f0afc1","Type":"ContainerStarted","Data":"13cafeb77b77b018ee01198935d577c4bea0510c89d36f8245c4cabbc0449b86"} Oct 13 18:25:29 crc kubenswrapper[4974]: I1013 18:25:29.151821 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:29 crc kubenswrapper[4974]: I1013 18:25:29.153980 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t" event={"ID":"16f35fcb-c349-4306-a4ee-306dfff9a8f1","Type":"ContainerStarted","Data":"ce76f0ab5e7b2f9d5e86052352f26552350d8871a735715e9261b68f3b0d253f"} Oct 13 18:25:29 crc kubenswrapper[4974]: I1013 18:25:29.155888 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" event={"ID":"ded07897-9b9c-4548-a909-02c623167912","Type":"ContainerStarted","Data":"0c9ccc797f110cffe405b9f70a7156c3159fd3335c5efbd5241e1f7c460d2b8c"} Oct 13 18:25:29 crc kubenswrapper[4974]: I1013 18:25:29.156025 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:25:29 crc kubenswrapper[4974]: I1013 18:25:29.178096 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-skb5t" podStartSLOduration=1.606374159 podStartE2EDuration="4.17807654s" podCreationTimestamp="2025-10-13 18:25:25 +0000 UTC" firstStartedPulling="2025-10-13 18:25:25.477785291 +0000 UTC m=+660.382151371" lastFinishedPulling="2025-10-13 18:25:28.049487632 +0000 UTC m=+662.953853752" observedRunningTime="2025-10-13 18:25:29.175330271 +0000 UTC m=+664.079696341" watchObservedRunningTime="2025-10-13 
18:25:29.17807654 +0000 UTC m=+664.082442620" Oct 13 18:25:30 crc kubenswrapper[4974]: I1013 18:25:30.166609 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" event={"ID":"4d9c8026-13ca-4df7-8bfc-d36594573e26","Type":"ContainerStarted","Data":"09463b26f3b006ab589883268efcc8ac59284ef9be8d676b0c48e03f093e8086"} Oct 13 18:25:30 crc kubenswrapper[4974]: I1013 18:25:30.186960 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-6b874cbd85-rx8hz" podStartSLOduration=1.849053574 podStartE2EDuration="5.186937858s" podCreationTimestamp="2025-10-13 18:25:25 +0000 UTC" firstStartedPulling="2025-10-13 18:25:25.747959588 +0000 UTC m=+660.652325668" lastFinishedPulling="2025-10-13 18:25:29.085843862 +0000 UTC m=+663.990209952" observedRunningTime="2025-10-13 18:25:30.186063243 +0000 UTC m=+665.090429343" watchObservedRunningTime="2025-10-13 18:25:30.186937858 +0000 UTC m=+665.091303938" Oct 13 18:25:30 crc kubenswrapper[4974]: I1013 18:25:30.191924 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" podStartSLOduration=3.401960602 podStartE2EDuration="5.191906941s" podCreationTimestamp="2025-10-13 18:25:25 +0000 UTC" firstStartedPulling="2025-10-13 18:25:26.259122561 +0000 UTC m=+661.163488641" lastFinishedPulling="2025-10-13 18:25:28.04906888 +0000 UTC m=+662.953434980" observedRunningTime="2025-10-13 18:25:29.19991872 +0000 UTC m=+664.104284800" watchObservedRunningTime="2025-10-13 18:25:30.191906941 +0000 UTC m=+665.096273021" Oct 13 18:25:31 crc kubenswrapper[4974]: I1013 18:25:31.174975 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t" event={"ID":"16f35fcb-c349-4306-a4ee-306dfff9a8f1","Type":"ContainerStarted","Data":"530a68292085ebcbea6e4e189dccd84629831443e8a91773b068c6097662d8bb"} Oct 13 18:25:31 crc 
kubenswrapper[4974]: I1013 18:25:31.202540 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-fdff9cb8d-zg86t" podStartSLOduration=1.304860509 podStartE2EDuration="6.202517067s" podCreationTimestamp="2025-10-13 18:25:25 +0000 UTC" firstStartedPulling="2025-10-13 18:25:25.68311954 +0000 UTC m=+660.587485620" lastFinishedPulling="2025-10-13 18:25:30.580776098 +0000 UTC m=+665.485142178" observedRunningTime="2025-10-13 18:25:31.197435801 +0000 UTC m=+666.101801891" watchObservedRunningTime="2025-10-13 18:25:31.202517067 +0000 UTC m=+666.106883157" Oct 13 18:25:35 crc kubenswrapper[4974]: I1013 18:25:35.492911 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-skb5t" Oct 13 18:25:35 crc kubenswrapper[4974]: I1013 18:25:35.835883 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:35 crc kubenswrapper[4974]: I1013 18:25:35.836952 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:35 crc kubenswrapper[4974]: I1013 18:25:35.843262 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:36 crc kubenswrapper[4974]: I1013 18:25:36.215343 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6fbd896c7d-9lqdg" Oct 13 18:25:36 crc kubenswrapper[4974]: I1013 18:25:36.290407 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jbznq"] Oct 13 18:25:46 crc kubenswrapper[4974]: I1013 18:25:46.036945 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6cdbc54649-rkbf2" Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.360172 4974 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-console/console-f9d7485db-jbznq" podUID="ce0c606d-4062-4f6a-afec-752440b5580c" containerName="console" containerID="cri-o://f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494" gracePeriod=15 Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.816959 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jbznq_ce0c606d-4062-4f6a-afec-752440b5580c/console/0.log" Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.817478 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.909584 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-trusted-ca-bundle\") pod \"ce0c606d-4062-4f6a-afec-752440b5580c\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.909636 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-console-config\") pod \"ce0c606d-4062-4f6a-afec-752440b5580c\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.909715 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-oauth-serving-cert\") pod \"ce0c606d-4062-4f6a-afec-752440b5580c\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.909750 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlxns\" (UniqueName: 
\"kubernetes.io/projected/ce0c606d-4062-4f6a-afec-752440b5580c-kube-api-access-wlxns\") pod \"ce0c606d-4062-4f6a-afec-752440b5580c\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.909804 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-service-ca\") pod \"ce0c606d-4062-4f6a-afec-752440b5580c\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.909835 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-oauth-config\") pod \"ce0c606d-4062-4f6a-afec-752440b5580c\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.909923 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-serving-cert\") pod \"ce0c606d-4062-4f6a-afec-752440b5580c\" (UID: \"ce0c606d-4062-4f6a-afec-752440b5580c\") " Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.910917 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-service-ca" (OuterVolumeSpecName: "service-ca") pod "ce0c606d-4062-4f6a-afec-752440b5580c" (UID: "ce0c606d-4062-4f6a-afec-752440b5580c"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.910942 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ce0c606d-4062-4f6a-afec-752440b5580c" (UID: "ce0c606d-4062-4f6a-afec-752440b5580c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.910999 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ce0c606d-4062-4f6a-afec-752440b5580c" (UID: "ce0c606d-4062-4f6a-afec-752440b5580c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.911513 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-console-config" (OuterVolumeSpecName: "console-config") pod "ce0c606d-4062-4f6a-afec-752440b5580c" (UID: "ce0c606d-4062-4f6a-afec-752440b5580c"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.918581 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ce0c606d-4062-4f6a-afec-752440b5580c" (UID: "ce0c606d-4062-4f6a-afec-752440b5580c"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.922142 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ce0c606d-4062-4f6a-afec-752440b5580c" (UID: "ce0c606d-4062-4f6a-afec-752440b5580c"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:26:01 crc kubenswrapper[4974]: I1013 18:26:01.922247 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce0c606d-4062-4f6a-afec-752440b5580c-kube-api-access-wlxns" (OuterVolumeSpecName: "kube-api-access-wlxns") pod "ce0c606d-4062-4f6a-afec-752440b5580c" (UID: "ce0c606d-4062-4f6a-afec-752440b5580c"). InnerVolumeSpecName "kube-api-access-wlxns". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.011736 4974 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.011771 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlxns\" (UniqueName: \"kubernetes.io/projected/ce0c606d-4062-4f6a-afec-752440b5580c-kube-api-access-wlxns\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.011784 4974 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-service-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.011792 4974 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.011801 4974 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce0c606d-4062-4f6a-afec-752440b5580c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.011809 4974 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.011820 4974 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce0c606d-4062-4f6a-afec-752440b5580c-console-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.446086 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-jbznq_ce0c606d-4062-4f6a-afec-752440b5580c/console/0.log" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.446143 4974 generic.go:334] "Generic (PLEG): container finished" podID="ce0c606d-4062-4f6a-afec-752440b5580c" containerID="f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494" exitCode=2 Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.446177 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jbznq" event={"ID":"ce0c606d-4062-4f6a-afec-752440b5580c","Type":"ContainerDied","Data":"f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494"} Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.446205 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-jbznq" 
event={"ID":"ce0c606d-4062-4f6a-afec-752440b5580c","Type":"ContainerDied","Data":"d61528ed40ea24156af1fd58000ad279a52a6e8611f08b4ed67b2792a3c1c103"} Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.446222 4974 scope.go:117] "RemoveContainer" containerID="f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.446248 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-jbznq" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.472542 4974 scope.go:117] "RemoveContainer" containerID="f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494" Oct 13 18:26:02 crc kubenswrapper[4974]: E1013 18:26:02.473097 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494\": container with ID starting with f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494 not found: ID does not exist" containerID="f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.473146 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494"} err="failed to get container status \"f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494\": rpc error: code = NotFound desc = could not find container \"f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494\": container with ID starting with f5bc907582227ea15df4b0f4685c6709b09369366d84cee610bd87a8bec30494 not found: ID does not exist" Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.495082 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-jbznq"] Oct 13 18:26:02 crc kubenswrapper[4974]: I1013 18:26:02.500958 4974 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-jbznq"] Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.646515 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk"] Oct 13 18:26:03 crc kubenswrapper[4974]: E1013 18:26:03.647066 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce0c606d-4062-4f6a-afec-752440b5580c" containerName="console" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.647082 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce0c606d-4062-4f6a-afec-752440b5580c" containerName="console" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.647243 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce0c606d-4062-4f6a-afec-752440b5580c" containerName="console" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.648205 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.651254 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.666872 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk"] Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.734283 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7c7g\" (UniqueName: \"kubernetes.io/projected/43f2dc64-d1da-4f5e-b7e6-f343259f5784-kube-api-access-m7c7g\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " 
pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.734566 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.734718 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.821367 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce0c606d-4062-4f6a-afec-752440b5580c" path="/var/lib/kubelet/pods/ce0c606d-4062-4f6a-afec-752440b5580c/volumes" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.835454 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7c7g\" (UniqueName: \"kubernetes.io/projected/43f2dc64-d1da-4f5e-b7e6-f343259f5784-kube-api-access-m7c7g\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.835512 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.835578 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.836123 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-util\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.836211 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-bundle\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.868754 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7c7g\" (UniqueName: \"kubernetes.io/projected/43f2dc64-d1da-4f5e-b7e6-f343259f5784-kube-api-access-m7c7g\") pod \"8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk\" (UID: 
\"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:03 crc kubenswrapper[4974]: I1013 18:26:03.964356 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:04 crc kubenswrapper[4974]: I1013 18:26:04.274297 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk"] Oct 13 18:26:04 crc kubenswrapper[4974]: I1013 18:26:04.473845 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" event={"ID":"43f2dc64-d1da-4f5e-b7e6-f343259f5784","Type":"ContainerStarted","Data":"b85ace5e1666c7b1d890b66abf13e8bbf4c39c9dde31ab3554d2757329876c38"} Oct 13 18:26:04 crc kubenswrapper[4974]: I1013 18:26:04.473900 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" event={"ID":"43f2dc64-d1da-4f5e-b7e6-f343259f5784","Type":"ContainerStarted","Data":"06affbfe49ef8c98cf307587e4b693ccf1d1990c558f2b58b1ddd3e338a96b73"} Oct 13 18:26:05 crc kubenswrapper[4974]: I1013 18:26:05.483290 4974 generic.go:334] "Generic (PLEG): container finished" podID="43f2dc64-d1da-4f5e-b7e6-f343259f5784" containerID="b85ace5e1666c7b1d890b66abf13e8bbf4c39c9dde31ab3554d2757329876c38" exitCode=0 Oct 13 18:26:05 crc kubenswrapper[4974]: I1013 18:26:05.483365 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" event={"ID":"43f2dc64-d1da-4f5e-b7e6-f343259f5784","Type":"ContainerDied","Data":"b85ace5e1666c7b1d890b66abf13e8bbf4c39c9dde31ab3554d2757329876c38"} Oct 13 18:26:07 crc kubenswrapper[4974]: I1013 18:26:07.504994 4974 
generic.go:334] "Generic (PLEG): container finished" podID="43f2dc64-d1da-4f5e-b7e6-f343259f5784" containerID="a8b33e027686b6c19b9cd9048e5d0ba8ae8278ab0817df4620f42afc18970006" exitCode=0 Oct 13 18:26:07 crc kubenswrapper[4974]: I1013 18:26:07.505076 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" event={"ID":"43f2dc64-d1da-4f5e-b7e6-f343259f5784","Type":"ContainerDied","Data":"a8b33e027686b6c19b9cd9048e5d0ba8ae8278ab0817df4620f42afc18970006"} Oct 13 18:26:08 crc kubenswrapper[4974]: I1013 18:26:08.517180 4974 generic.go:334] "Generic (PLEG): container finished" podID="43f2dc64-d1da-4f5e-b7e6-f343259f5784" containerID="c63d790f1acfbd7673e820c08c5179c8d2a8dae3aaea6dd69913acf44c465d7a" exitCode=0 Oct 13 18:26:08 crc kubenswrapper[4974]: I1013 18:26:08.517255 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" event={"ID":"43f2dc64-d1da-4f5e-b7e6-f343259f5784","Type":"ContainerDied","Data":"c63d790f1acfbd7673e820c08c5179c8d2a8dae3aaea6dd69913acf44c465d7a"} Oct 13 18:26:09 crc kubenswrapper[4974]: I1013 18:26:09.794853 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:09 crc kubenswrapper[4974]: I1013 18:26:09.928303 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7c7g\" (UniqueName: \"kubernetes.io/projected/43f2dc64-d1da-4f5e-b7e6-f343259f5784-kube-api-access-m7c7g\") pod \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " Oct 13 18:26:09 crc kubenswrapper[4974]: I1013 18:26:09.928396 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-util\") pod \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " Oct 13 18:26:09 crc kubenswrapper[4974]: I1013 18:26:09.928462 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-bundle\") pod \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\" (UID: \"43f2dc64-d1da-4f5e-b7e6-f343259f5784\") " Oct 13 18:26:09 crc kubenswrapper[4974]: I1013 18:26:09.929409 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-bundle" (OuterVolumeSpecName: "bundle") pod "43f2dc64-d1da-4f5e-b7e6-f343259f5784" (UID: "43f2dc64-d1da-4f5e-b7e6-f343259f5784"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:26:09 crc kubenswrapper[4974]: I1013 18:26:09.937856 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43f2dc64-d1da-4f5e-b7e6-f343259f5784-kube-api-access-m7c7g" (OuterVolumeSpecName: "kube-api-access-m7c7g") pod "43f2dc64-d1da-4f5e-b7e6-f343259f5784" (UID: "43f2dc64-d1da-4f5e-b7e6-f343259f5784"). InnerVolumeSpecName "kube-api-access-m7c7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:26:10 crc kubenswrapper[4974]: I1013 18:26:10.030171 4974 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:10 crc kubenswrapper[4974]: I1013 18:26:10.030215 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m7c7g\" (UniqueName: \"kubernetes.io/projected/43f2dc64-d1da-4f5e-b7e6-f343259f5784-kube-api-access-m7c7g\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:10 crc kubenswrapper[4974]: I1013 18:26:10.165979 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-util" (OuterVolumeSpecName: "util") pod "43f2dc64-d1da-4f5e-b7e6-f343259f5784" (UID: "43f2dc64-d1da-4f5e-b7e6-f343259f5784"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:26:10 crc kubenswrapper[4974]: I1013 18:26:10.233833 4974 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43f2dc64-d1da-4f5e-b7e6-f343259f5784-util\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:10 crc kubenswrapper[4974]: I1013 18:26:10.537226 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" event={"ID":"43f2dc64-d1da-4f5e-b7e6-f343259f5784","Type":"ContainerDied","Data":"06affbfe49ef8c98cf307587e4b693ccf1d1990c558f2b58b1ddd3e338a96b73"} Oct 13 18:26:10 crc kubenswrapper[4974]: I1013 18:26:10.537288 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06affbfe49ef8c98cf307587e4b693ccf1d1990c558f2b58b1ddd3e338a96b73" Oct 13 18:26:10 crc kubenswrapper[4974]: I1013 18:26:10.537373 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.277636 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv"] Oct 13 18:26:23 crc kubenswrapper[4974]: E1013 18:26:23.278649 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f2dc64-d1da-4f5e-b7e6-f343259f5784" containerName="extract" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.278696 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f2dc64-d1da-4f5e-b7e6-f343259f5784" containerName="extract" Oct 13 18:26:23 crc kubenswrapper[4974]: E1013 18:26:23.278737 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f2dc64-d1da-4f5e-b7e6-f343259f5784" containerName="util" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.278750 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f2dc64-d1da-4f5e-b7e6-f343259f5784" containerName="util" Oct 13 18:26:23 crc kubenswrapper[4974]: E1013 18:26:23.278765 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43f2dc64-d1da-4f5e-b7e6-f343259f5784" containerName="pull" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.278777 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="43f2dc64-d1da-4f5e-b7e6-f343259f5784" containerName="pull" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.278944 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="43f2dc64-d1da-4f5e-b7e6-f343259f5784" containerName="extract" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.279607 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.292227 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.292962 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.293193 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.296899 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-q84v9" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.309887 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.310546 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b51360f9-c2df-4940-8a9f-91bd9287605c-webhook-cert\") pod \"metallb-operator-controller-manager-fd8c579fc-kgnkv\" (UID: \"b51360f9-c2df-4940-8a9f-91bd9287605c\") " pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.310595 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtg8r\" (UniqueName: \"kubernetes.io/projected/b51360f9-c2df-4940-8a9f-91bd9287605c-kube-api-access-gtg8r\") pod \"metallb-operator-controller-manager-fd8c579fc-kgnkv\" (UID: \"b51360f9-c2df-4940-8a9f-91bd9287605c\") " pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.310629 
4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b51360f9-c2df-4940-8a9f-91bd9287605c-apiservice-cert\") pod \"metallb-operator-controller-manager-fd8c579fc-kgnkv\" (UID: \"b51360f9-c2df-4940-8a9f-91bd9287605c\") " pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.317505 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv"] Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.413420 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b51360f9-c2df-4940-8a9f-91bd9287605c-webhook-cert\") pod \"metallb-operator-controller-manager-fd8c579fc-kgnkv\" (UID: \"b51360f9-c2df-4940-8a9f-91bd9287605c\") " pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.413468 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b51360f9-c2df-4940-8a9f-91bd9287605c-apiservice-cert\") pod \"metallb-operator-controller-manager-fd8c579fc-kgnkv\" (UID: \"b51360f9-c2df-4940-8a9f-91bd9287605c\") " pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.413489 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtg8r\" (UniqueName: \"kubernetes.io/projected/b51360f9-c2df-4940-8a9f-91bd9287605c-kube-api-access-gtg8r\") pod \"metallb-operator-controller-manager-fd8c579fc-kgnkv\" (UID: \"b51360f9-c2df-4940-8a9f-91bd9287605c\") " pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.421311 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b51360f9-c2df-4940-8a9f-91bd9287605c-webhook-cert\") pod \"metallb-operator-controller-manager-fd8c579fc-kgnkv\" (UID: \"b51360f9-c2df-4940-8a9f-91bd9287605c\") " pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.434369 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtg8r\" (UniqueName: \"kubernetes.io/projected/b51360f9-c2df-4940-8a9f-91bd9287605c-kube-api-access-gtg8r\") pod \"metallb-operator-controller-manager-fd8c579fc-kgnkv\" (UID: \"b51360f9-c2df-4940-8a9f-91bd9287605c\") " pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.435282 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b51360f9-c2df-4940-8a9f-91bd9287605c-apiservice-cert\") pod \"metallb-operator-controller-manager-fd8c579fc-kgnkv\" (UID: \"b51360f9-c2df-4940-8a9f-91bd9287605c\") " pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.679690 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.712555 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q"] Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.713568 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.715995 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.716667 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9htxj\" (UniqueName: \"kubernetes.io/projected/ca62cae8-3dc3-492d-aa06-59d085da2253-kube-api-access-9htxj\") pod \"metallb-operator-webhook-server-9c5d5965c-l9f5q\" (UID: \"ca62cae8-3dc3-492d-aa06-59d085da2253\") " pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.716937 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca62cae8-3dc3-492d-aa06-59d085da2253-apiservice-cert\") pod \"metallb-operator-webhook-server-9c5d5965c-l9f5q\" (UID: \"ca62cae8-3dc3-492d-aa06-59d085da2253\") " pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.717138 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca62cae8-3dc3-492d-aa06-59d085da2253-webhook-cert\") pod \"metallb-operator-webhook-server-9c5d5965c-l9f5q\" (UID: \"ca62cae8-3dc3-492d-aa06-59d085da2253\") " pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.717998 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.721052 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-gmr9v" Oct 13 
18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.731738 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q"] Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.817984 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9htxj\" (UniqueName: \"kubernetes.io/projected/ca62cae8-3dc3-492d-aa06-59d085da2253-kube-api-access-9htxj\") pod \"metallb-operator-webhook-server-9c5d5965c-l9f5q\" (UID: \"ca62cae8-3dc3-492d-aa06-59d085da2253\") " pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.818062 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca62cae8-3dc3-492d-aa06-59d085da2253-apiservice-cert\") pod \"metallb-operator-webhook-server-9c5d5965c-l9f5q\" (UID: \"ca62cae8-3dc3-492d-aa06-59d085da2253\") " pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.818106 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca62cae8-3dc3-492d-aa06-59d085da2253-webhook-cert\") pod \"metallb-operator-webhook-server-9c5d5965c-l9f5q\" (UID: \"ca62cae8-3dc3-492d-aa06-59d085da2253\") " pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.822357 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca62cae8-3dc3-492d-aa06-59d085da2253-webhook-cert\") pod \"metallb-operator-webhook-server-9c5d5965c-l9f5q\" (UID: \"ca62cae8-3dc3-492d-aa06-59d085da2253\") " pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.835024 4974 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9htxj\" (UniqueName: \"kubernetes.io/projected/ca62cae8-3dc3-492d-aa06-59d085da2253-kube-api-access-9htxj\") pod \"metallb-operator-webhook-server-9c5d5965c-l9f5q\" (UID: \"ca62cae8-3dc3-492d-aa06-59d085da2253\") " pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:23 crc kubenswrapper[4974]: I1013 18:26:23.837077 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca62cae8-3dc3-492d-aa06-59d085da2253-apiservice-cert\") pod \"metallb-operator-webhook-server-9c5d5965c-l9f5q\" (UID: \"ca62cae8-3dc3-492d-aa06-59d085da2253\") " pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:24 crc kubenswrapper[4974]: I1013 18:26:24.047314 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:24 crc kubenswrapper[4974]: I1013 18:26:24.135574 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv"] Oct 13 18:26:24 crc kubenswrapper[4974]: I1013 18:26:24.520841 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q"] Oct 13 18:26:24 crc kubenswrapper[4974]: W1013 18:26:24.529836 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca62cae8_3dc3_492d_aa06_59d085da2253.slice/crio-c31dfcb3b284b138f87347a1a264e878782c56baafdd4c416985f4c40c647968 WatchSource:0}: Error finding container c31dfcb3b284b138f87347a1a264e878782c56baafdd4c416985f4c40c647968: Status 404 returned error can't find the container with id c31dfcb3b284b138f87347a1a264e878782c56baafdd4c416985f4c40c647968 Oct 13 18:26:24 crc kubenswrapper[4974]: I1013 18:26:24.647912 4974 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" event={"ID":"b51360f9-c2df-4940-8a9f-91bd9287605c","Type":"ContainerStarted","Data":"a673c0493233e06993b3b91193eeaa1b6ceaa097cba5d770d6d59e561d5a418c"} Oct 13 18:26:24 crc kubenswrapper[4974]: I1013 18:26:24.649445 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" event={"ID":"ca62cae8-3dc3-492d-aa06-59d085da2253","Type":"ContainerStarted","Data":"c31dfcb3b284b138f87347a1a264e878782c56baafdd4c416985f4c40c647968"} Oct 13 18:26:31 crc kubenswrapper[4974]: I1013 18:26:31.717254 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" event={"ID":"b51360f9-c2df-4940-8a9f-91bd9287605c","Type":"ContainerStarted","Data":"1aa5c6622dea647c6366b88cd8e6fc64b700d6707a17b5b276ef51c153e88b7f"} Oct 13 18:26:31 crc kubenswrapper[4974]: I1013 18:26:31.718093 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:26:31 crc kubenswrapper[4974]: I1013 18:26:31.719735 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" event={"ID":"ca62cae8-3dc3-492d-aa06-59d085da2253","Type":"ContainerStarted","Data":"09d49e14d9ae11a60333288a03cb782b2663e63c5b9c1577b2601b91106172ca"} Oct 13 18:26:31 crc kubenswrapper[4974]: I1013 18:26:31.719898 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:31 crc kubenswrapper[4974]: I1013 18:26:31.750341 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" podStartSLOduration=2.4457881439999998 podStartE2EDuration="8.750306972s" podCreationTimestamp="2025-10-13 18:26:23 +0000 
UTC" firstStartedPulling="2025-10-13 18:26:24.146565834 +0000 UTC m=+719.050931924" lastFinishedPulling="2025-10-13 18:26:30.451084632 +0000 UTC m=+725.355450752" observedRunningTime="2025-10-13 18:26:31.743350424 +0000 UTC m=+726.647716514" watchObservedRunningTime="2025-10-13 18:26:31.750306972 +0000 UTC m=+726.654673122" Oct 13 18:26:31 crc kubenswrapper[4974]: I1013 18:26:31.779248 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" podStartSLOduration=2.75929751 podStartE2EDuration="8.779227076s" podCreationTimestamp="2025-10-13 18:26:23 +0000 UTC" firstStartedPulling="2025-10-13 18:26:24.533239733 +0000 UTC m=+719.437605813" lastFinishedPulling="2025-10-13 18:26:30.553169289 +0000 UTC m=+725.457535379" observedRunningTime="2025-10-13 18:26:31.777636 +0000 UTC m=+726.682002080" watchObservedRunningTime="2025-10-13 18:26:31.779227076 +0000 UTC m=+726.683593156" Oct 13 18:26:44 crc kubenswrapper[4974]: I1013 18:26:44.054689 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-9c5d5965c-l9f5q" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.399946 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2qjrb"] Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.400685 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" podUID="f4dff75c-23a2-40e2-9259-e056682367d7" containerName="controller-manager" containerID="cri-o://dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160" gracePeriod=30 Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.413105 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t"] Oct 13 18:26:51 crc kubenswrapper[4974]: 
I1013 18:26:51.413407 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" podUID="76842663-4197-4c71-8601-6a657814388b" containerName="route-controller-manager" containerID="cri-o://dbb989c50e965a36408c384d0e26c3a7a0f1425d5c5c34894fa15048ad73f0f1" gracePeriod=30 Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.833035 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.882248 4974 generic.go:334] "Generic (PLEG): container finished" podID="f4dff75c-23a2-40e2-9259-e056682367d7" containerID="dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160" exitCode=0 Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.882329 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" event={"ID":"f4dff75c-23a2-40e2-9259-e056682367d7","Type":"ContainerDied","Data":"dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160"} Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.882364 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" event={"ID":"f4dff75c-23a2-40e2-9259-e056682367d7","Type":"ContainerDied","Data":"f9c0968f052effaeb17178c794569f72fb74c52a1c7c264c795f0081566d118a"} Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.882385 4974 scope.go:117] "RemoveContainer" containerID="dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.882506 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-2qjrb" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.886304 4974 generic.go:334] "Generic (PLEG): container finished" podID="76842663-4197-4c71-8601-6a657814388b" containerID="dbb989c50e965a36408c384d0e26c3a7a0f1425d5c5c34894fa15048ad73f0f1" exitCode=0 Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.886351 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" event={"ID":"76842663-4197-4c71-8601-6a657814388b","Type":"ContainerDied","Data":"dbb989c50e965a36408c384d0e26c3a7a0f1425d5c5c34894fa15048ad73f0f1"} Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.886378 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" event={"ID":"76842663-4197-4c71-8601-6a657814388b","Type":"ContainerDied","Data":"71760778a44f71049938f941216309eccb230aa231c48a0a8748e7d9595a6bf2"} Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.886391 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71760778a44f71049938f941216309eccb230aa231c48a0a8748e7d9595a6bf2" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.898226 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.908041 4974 scope.go:117] "RemoveContainer" containerID="dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160" Oct 13 18:26:51 crc kubenswrapper[4974]: E1013 18:26:51.909280 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160\": container with ID starting with dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160 not found: ID does not exist" containerID="dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.909321 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160"} err="failed to get container status \"dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160\": rpc error: code = NotFound desc = could not find container \"dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160\": container with ID starting with dc73cabb2961dd539e5da006acd82f92d2e497ac190950a0db260d4dceeda160 not found: ID does not exist" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.943999 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pdlf\" (UniqueName: \"kubernetes.io/projected/f4dff75c-23a2-40e2-9259-e056682367d7-kube-api-access-2pdlf\") pod \"f4dff75c-23a2-40e2-9259-e056682367d7\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.944403 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-client-ca\") pod 
\"f4dff75c-23a2-40e2-9259-e056682367d7\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.944451 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4dff75c-23a2-40e2-9259-e056682367d7-serving-cert\") pod \"f4dff75c-23a2-40e2-9259-e056682367d7\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.944476 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-client-ca\") pod \"76842663-4197-4c71-8601-6a657814388b\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.944508 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-proxy-ca-bundles\") pod \"f4dff75c-23a2-40e2-9259-e056682367d7\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.944558 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-config\") pod \"f4dff75c-23a2-40e2-9259-e056682367d7\" (UID: \"f4dff75c-23a2-40e2-9259-e056682367d7\") " Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.944632 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9lr4\" (UniqueName: \"kubernetes.io/projected/76842663-4197-4c71-8601-6a657814388b-kube-api-access-q9lr4\") pod \"76842663-4197-4c71-8601-6a657814388b\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.945132 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f4dff75c-23a2-40e2-9259-e056682367d7" (UID: "f4dff75c-23a2-40e2-9259-e056682367d7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.945208 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-client-ca" (OuterVolumeSpecName: "client-ca") pod "76842663-4197-4c71-8601-6a657814388b" (UID: "76842663-4197-4c71-8601-6a657814388b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.945208 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-client-ca" (OuterVolumeSpecName: "client-ca") pod "f4dff75c-23a2-40e2-9259-e056682367d7" (UID: "f4dff75c-23a2-40e2-9259-e056682367d7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.945302 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-config" (OuterVolumeSpecName: "config") pod "f4dff75c-23a2-40e2-9259-e056682367d7" (UID: "f4dff75c-23a2-40e2-9259-e056682367d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.952147 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4dff75c-23a2-40e2-9259-e056682367d7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f4dff75c-23a2-40e2-9259-e056682367d7" (UID: "f4dff75c-23a2-40e2-9259-e056682367d7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.952853 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76842663-4197-4c71-8601-6a657814388b-kube-api-access-q9lr4" (OuterVolumeSpecName: "kube-api-access-q9lr4") pod "76842663-4197-4c71-8601-6a657814388b" (UID: "76842663-4197-4c71-8601-6a657814388b"). InnerVolumeSpecName "kube-api-access-q9lr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:26:51 crc kubenswrapper[4974]: I1013 18:26:51.953384 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4dff75c-23a2-40e2-9259-e056682367d7-kube-api-access-2pdlf" (OuterVolumeSpecName: "kube-api-access-2pdlf") pod "f4dff75c-23a2-40e2-9259-e056682367d7" (UID: "f4dff75c-23a2-40e2-9259-e056682367d7"). InnerVolumeSpecName "kube-api-access-2pdlf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.045354 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-config\") pod \"76842663-4197-4c71-8601-6a657814388b\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.045431 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76842663-4197-4c71-8601-6a657814388b-serving-cert\") pod \"76842663-4197-4c71-8601-6a657814388b\" (UID: \"76842663-4197-4c71-8601-6a657814388b\") " Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.045753 4974 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-client-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.045772 4974 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4dff75c-23a2-40e2-9259-e056682367d7-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.045781 4974 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-client-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.045790 4974 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.045801 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4dff75c-23a2-40e2-9259-e056682367d7-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.045810 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9lr4\" (UniqueName: \"kubernetes.io/projected/76842663-4197-4c71-8601-6a657814388b-kube-api-access-q9lr4\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.045819 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pdlf\" (UniqueName: \"kubernetes.io/projected/f4dff75c-23a2-40e2-9259-e056682367d7-kube-api-access-2pdlf\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.046071 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-config" (OuterVolumeSpecName: "config") pod "76842663-4197-4c71-8601-6a657814388b" (UID: "76842663-4197-4c71-8601-6a657814388b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.048879 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76842663-4197-4c71-8601-6a657814388b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "76842663-4197-4c71-8601-6a657814388b" (UID: "76842663-4197-4c71-8601-6a657814388b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.146572 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76842663-4197-4c71-8601-6a657814388b-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.146616 4974 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76842663-4197-4c71-8601-6a657814388b-serving-cert\") on node \"crc\" DevicePath \"\"" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.214914 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2qjrb"] Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.220924 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-2qjrb"] Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.895883 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t" Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.928522 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t"] Oct 13 18:26:52 crc kubenswrapper[4974]: I1013 18:26:52.934147 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-mqg2t"] Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.506078 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-686df59677-5n7lj"] Oct 13 18:26:53 crc kubenswrapper[4974]: E1013 18:26:53.506584 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4dff75c-23a2-40e2-9259-e056682367d7" containerName="controller-manager" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.506600 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4dff75c-23a2-40e2-9259-e056682367d7" containerName="controller-manager" Oct 13 18:26:53 crc kubenswrapper[4974]: E1013 18:26:53.506613 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76842663-4197-4c71-8601-6a657814388b" containerName="route-controller-manager" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.506620 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="76842663-4197-4c71-8601-6a657814388b" containerName="route-controller-manager" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.506733 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4dff75c-23a2-40e2-9259-e056682367d7" containerName="controller-manager" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.506746 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="76842663-4197-4c71-8601-6a657814388b" containerName="route-controller-manager" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.507154 
4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.509428 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl"] Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.510026 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.510168 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.513371 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.513396 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.513419 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.513412 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.514429 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.515064 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.515075 4974 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-route-controller-manager"/"config" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.515446 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.515752 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.515783 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.516591 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.524169 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-686df59677-5n7lj"] Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.524941 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.561181 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl"] Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.566288 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqwlw\" (UniqueName: \"kubernetes.io/projected/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-kube-api-access-nqwlw\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.566339 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-serving-cert\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.566370 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-config\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.566389 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-serving-cert\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.566410 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-client-ca\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.566443 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-client-ca\") pod \"controller-manager-686df59677-5n7lj\" (UID: 
\"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.566462 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-config\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.566487 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-proxy-ca-bundles\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.566507 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wfhw\" (UniqueName: \"kubernetes.io/projected/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-kube-api-access-5wfhw\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.667407 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqwlw\" (UniqueName: \"kubernetes.io/projected/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-kube-api-access-nqwlw\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.667478 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-serving-cert\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.667512 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-config\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.667529 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-serving-cert\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.667547 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-client-ca\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.667574 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-client-ca\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " 
pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.667592 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-config\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.667616 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-proxy-ca-bundles\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.667631 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wfhw\" (UniqueName: \"kubernetes.io/projected/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-kube-api-access-5wfhw\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.668593 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-client-ca\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.668727 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-client-ca\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.669044 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-config\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.669058 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-config\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.669260 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-proxy-ca-bundles\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.674224 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-serving-cert\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.679445 4974 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-serving-cert\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.691791 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqwlw\" (UniqueName: \"kubernetes.io/projected/b9ba4d56-ebd7-4530-bf75-1f4538bc3230-kube-api-access-nqwlw\") pod \"route-controller-manager-5fbbbd985d-sxddl\" (UID: \"b9ba4d56-ebd7-4530-bf75-1f4538bc3230\") " pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.715708 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wfhw\" (UniqueName: \"kubernetes.io/projected/7a2afc07-b90f-4a29-94ef-180cf7fb0a6e-kube-api-access-5wfhw\") pod \"controller-manager-686df59677-5n7lj\" (UID: \"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e\") " pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.820896 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76842663-4197-4c71-8601-6a657814388b" path="/var/lib/kubelet/pods/76842663-4197-4c71-8601-6a657814388b/volumes" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.821502 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4dff75c-23a2-40e2-9259-e056682367d7" path="/var/lib/kubelet/pods/f4dff75c-23a2-40e2-9259-e056682367d7/volumes" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.824029 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:53 crc kubenswrapper[4974]: I1013 18:26:53.835022 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.091796 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl"] Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.269771 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-686df59677-5n7lj"] Oct 13 18:26:54 crc kubenswrapper[4974]: W1013 18:26:54.280812 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a2afc07_b90f_4a29_94ef_180cf7fb0a6e.slice/crio-6dcaf365ef6f43408785ca95ccab47fe76744a29340bd12a7cc2f6b7bf81425d WatchSource:0}: Error finding container 6dcaf365ef6f43408785ca95ccab47fe76744a29340bd12a7cc2f6b7bf81425d: Status 404 returned error can't find the container with id 6dcaf365ef6f43408785ca95ccab47fe76744a29340bd12a7cc2f6b7bf81425d Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.916210 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" event={"ID":"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e","Type":"ContainerStarted","Data":"45509fb5f574b677569f271eeaa1dd02e7ac60ccddc3627d9d2a48b40f896218"} Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.916626 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" event={"ID":"7a2afc07-b90f-4a29-94ef-180cf7fb0a6e","Type":"ContainerStarted","Data":"6dcaf365ef6f43408785ca95ccab47fe76744a29340bd12a7cc2f6b7bf81425d"} Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.916730 4974 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.918050 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" event={"ID":"b9ba4d56-ebd7-4530-bf75-1f4538bc3230","Type":"ContainerStarted","Data":"006ea6c740c1d7434141fdc6287fdbe9a9ef6aec6f92bb6c45e5f47e67165396"} Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.918096 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" event={"ID":"b9ba4d56-ebd7-4530-bf75-1f4538bc3230","Type":"ContainerStarted","Data":"d8a5811115490f4a95aaf32582f8275c5d7b64bb989364ec403acfbd6a8f64da"} Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.918286 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.921882 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.936703 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-686df59677-5n7lj" podStartSLOduration=3.936624115 podStartE2EDuration="3.936624115s" podCreationTimestamp="2025-10-13 18:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:26:54.932323783 +0000 UTC m=+749.836689883" watchObservedRunningTime="2025-10-13 18:26:54.936624115 +0000 UTC m=+749.840990205" Oct 13 18:26:54 crc kubenswrapper[4974]: I1013 18:26:54.954008 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" podStartSLOduration=3.953990529 podStartE2EDuration="3.953990529s" podCreationTimestamp="2025-10-13 18:26:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:26:54.95296059 +0000 UTC m=+749.857326700" watchObservedRunningTime="2025-10-13 18:26:54.953990529 +0000 UTC m=+749.858356619" Oct 13 18:26:55 crc kubenswrapper[4974]: I1013 18:26:55.041207 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5fbbbd985d-sxddl" Oct 13 18:27:00 crc kubenswrapper[4974]: I1013 18:27:00.357253 4974 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Oct 13 18:27:03 crc kubenswrapper[4974]: I1013 18:27:03.688328 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-fd8c579fc-kgnkv" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.511921 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-w9rrg"] Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.514705 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.517134 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.517334 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-chnn7" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.518787 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.523447 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-reloader\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.523494 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv5tk\" (UniqueName: \"kubernetes.io/projected/0842201f-d8df-4376-b130-4bc0c560dc37-kube-api-access-tv5tk\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.523522 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0842201f-d8df-4376-b130-4bc0c560dc37-frr-startup\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.523561 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0842201f-d8df-4376-b130-4bc0c560dc37-metrics-certs\") pod 
\"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.523626 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-metrics\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.523689 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-frr-sockets\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.523709 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-frr-conf\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.536687 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv"] Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.537576 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.542554 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.563147 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv"] Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.625139 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0842201f-d8df-4376-b130-4bc0c560dc37-frr-startup\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.625210 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0842201f-d8df-4376-b130-4bc0c560dc37-metrics-certs\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.625280 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqgrv\" (UniqueName: \"kubernetes.io/projected/b014985a-51e5-494a-a16b-c126e6fce6b3-kube-api-access-zqgrv\") pod \"frr-k8s-webhook-server-64bf5d555-t5brv\" (UID: \"b014985a-51e5-494a-a16b-c126e6fce6b3\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.625312 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-metrics\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: 
I1013 18:27:04.625351 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-frr-sockets\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.625372 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-frr-conf\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.625406 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-reloader\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.625438 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b014985a-51e5-494a-a16b-c126e6fce6b3-cert\") pod \"frr-k8s-webhook-server-64bf5d555-t5brv\" (UID: \"b014985a-51e5-494a-a16b-c126e6fce6b3\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.625461 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv5tk\" (UniqueName: \"kubernetes.io/projected/0842201f-d8df-4376-b130-4bc0c560dc37-kube-api-access-tv5tk\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.625817 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-metrics\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.625889 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-frr-sockets\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.626295 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-reloader\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.626348 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0842201f-d8df-4376-b130-4bc0c560dc37-frr-conf\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.627080 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0842201f-d8df-4376-b130-4bc0c560dc37-frr-startup\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.629107 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-lc487"] Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.630241 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.632332 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-jrskh" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.633070 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.633327 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.633561 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.647395 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0842201f-d8df-4376-b130-4bc0c560dc37-metrics-certs\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.651630 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-68d546b9d8-2t9km"] Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.652695 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.655258 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.657291 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv5tk\" (UniqueName: \"kubernetes.io/projected/0842201f-d8df-4376-b130-4bc0c560dc37-kube-api-access-tv5tk\") pod \"frr-k8s-w9rrg\" (UID: \"0842201f-d8df-4376-b130-4bc0c560dc37\") " pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.662574 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-2t9km"] Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.726644 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-memberlist\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.726712 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-metrics-certs\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.726745 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfkrc\" (UniqueName: \"kubernetes.io/projected/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-kube-api-access-kfkrc\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.726781 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77a8d6d5-aa09-4168-8c4f-228849d999e2-cert\") pod \"controller-68d546b9d8-2t9km\" (UID: \"77a8d6d5-aa09-4168-8c4f-228849d999e2\") " pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.726807 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqgrv\" (UniqueName: \"kubernetes.io/projected/b014985a-51e5-494a-a16b-c126e6fce6b3-kube-api-access-zqgrv\") pod \"frr-k8s-webhook-server-64bf5d555-t5brv\" (UID: \"b014985a-51e5-494a-a16b-c126e6fce6b3\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.726837 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-metallb-excludel2\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.726864 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p7kd\" (UniqueName: \"kubernetes.io/projected/77a8d6d5-aa09-4168-8c4f-228849d999e2-kube-api-access-6p7kd\") pod \"controller-68d546b9d8-2t9km\" (UID: \"77a8d6d5-aa09-4168-8c4f-228849d999e2\") " pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.726907 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b014985a-51e5-494a-a16b-c126e6fce6b3-cert\") pod \"frr-k8s-webhook-server-64bf5d555-t5brv\" (UID: \"b014985a-51e5-494a-a16b-c126e6fce6b3\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 
18:27:04.726927 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77a8d6d5-aa09-4168-8c4f-228849d999e2-metrics-certs\") pod \"controller-68d546b9d8-2t9km\" (UID: \"77a8d6d5-aa09-4168-8c4f-228849d999e2\") " pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.731182 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b014985a-51e5-494a-a16b-c126e6fce6b3-cert\") pod \"frr-k8s-webhook-server-64bf5d555-t5brv\" (UID: \"b014985a-51e5-494a-a16b-c126e6fce6b3\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.741801 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqgrv\" (UniqueName: \"kubernetes.io/projected/b014985a-51e5-494a-a16b-c126e6fce6b3-kube-api-access-zqgrv\") pod \"frr-k8s-webhook-server-64bf5d555-t5brv\" (UID: \"b014985a-51e5-494a-a16b-c126e6fce6b3\") " pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.827567 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-metallb-excludel2\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.827615 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p7kd\" (UniqueName: \"kubernetes.io/projected/77a8d6d5-aa09-4168-8c4f-228849d999e2-kube-api-access-6p7kd\") pod \"controller-68d546b9d8-2t9km\" (UID: \"77a8d6d5-aa09-4168-8c4f-228849d999e2\") " pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 
18:27:04.827709 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77a8d6d5-aa09-4168-8c4f-228849d999e2-metrics-certs\") pod \"controller-68d546b9d8-2t9km\" (UID: \"77a8d6d5-aa09-4168-8c4f-228849d999e2\") " pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.827778 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-memberlist\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.827805 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-metrics-certs\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.827822 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfkrc\" (UniqueName: \"kubernetes.io/projected/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-kube-api-access-kfkrc\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.827856 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/77a8d6d5-aa09-4168-8c4f-228849d999e2-cert\") pod \"controller-68d546b9d8-2t9km\" (UID: \"77a8d6d5-aa09-4168-8c4f-228849d999e2\") " pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:04 crc kubenswrapper[4974]: E1013 18:27:04.828007 4974 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 13 18:27:04 crc 
kubenswrapper[4974]: E1013 18:27:04.828073 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-memberlist podName:6d0e7abb-aa57-48af-9a9a-d3c626b9131a nodeName:}" failed. No retries permitted until 2025-10-13 18:27:05.328054236 +0000 UTC m=+760.232420416 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-memberlist") pod "speaker-lc487" (UID: "6d0e7abb-aa57-48af-9a9a-d3c626b9131a") : secret "metallb-memberlist" not found Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.830631 4974 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.830947 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-metallb-excludel2\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.830990 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-metrics-certs\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.835006 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/77a8d6d5-aa09-4168-8c4f-228849d999e2-metrics-certs\") pod \"controller-68d546b9d8-2t9km\" (UID: \"77a8d6d5-aa09-4168-8c4f-228849d999e2\") " pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.841489 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/77a8d6d5-aa09-4168-8c4f-228849d999e2-cert\") pod \"controller-68d546b9d8-2t9km\" (UID: \"77a8d6d5-aa09-4168-8c4f-228849d999e2\") " pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.845086 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.845808 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfkrc\" (UniqueName: \"kubernetes.io/projected/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-kube-api-access-kfkrc\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.846080 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p7kd\" (UniqueName: \"kubernetes.io/projected/77a8d6d5-aa09-4168-8c4f-228849d999e2-kube-api-access-6p7kd\") pod \"controller-68d546b9d8-2t9km\" (UID: \"77a8d6d5-aa09-4168-8c4f-228849d999e2\") " pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:04 crc kubenswrapper[4974]: I1013 18:27:04.858691 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" Oct 13 18:27:05 crc kubenswrapper[4974]: I1013 18:27:05.009872 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:05 crc kubenswrapper[4974]: I1013 18:27:05.335113 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-memberlist\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:05 crc kubenswrapper[4974]: E1013 18:27:05.335376 4974 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Oct 13 18:27:05 crc kubenswrapper[4974]: E1013 18:27:05.335494 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-memberlist podName:6d0e7abb-aa57-48af-9a9a-d3c626b9131a nodeName:}" failed. No retries permitted until 2025-10-13 18:27:06.335467683 +0000 UTC m=+761.239833783 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-memberlist") pod "speaker-lc487" (UID: "6d0e7abb-aa57-48af-9a9a-d3c626b9131a") : secret "metallb-memberlist" not found Oct 13 18:27:05 crc kubenswrapper[4974]: I1013 18:27:05.394635 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv"] Oct 13 18:27:05 crc kubenswrapper[4974]: I1013 18:27:05.479705 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-68d546b9d8-2t9km"] Oct 13 18:27:05 crc kubenswrapper[4974]: W1013 18:27:05.488514 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77a8d6d5_aa09_4168_8c4f_228849d999e2.slice/crio-f06d9e3d0ccd04d6484fb054c78b760270e6e154cfa86c8bcef47cd2d84f86b0 WatchSource:0}: Error finding container f06d9e3d0ccd04d6484fb054c78b760270e6e154cfa86c8bcef47cd2d84f86b0: Status 404 
returned error can't find the container with id f06d9e3d0ccd04d6484fb054c78b760270e6e154cfa86c8bcef47cd2d84f86b0 Oct 13 18:27:06 crc kubenswrapper[4974]: I1013 18:27:06.007242 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-2t9km" event={"ID":"77a8d6d5-aa09-4168-8c4f-228849d999e2","Type":"ContainerStarted","Data":"03b4d9b5da5fcf30ef5aa963c803ea7a20f919d8ea622c682c937f920c22894c"} Oct 13 18:27:06 crc kubenswrapper[4974]: I1013 18:27:06.007812 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-2t9km" event={"ID":"77a8d6d5-aa09-4168-8c4f-228849d999e2","Type":"ContainerStarted","Data":"407818872e296538dfdb3d2e5ff25b4b193729f215173b84e8cd53efe3ebc64f"} Oct 13 18:27:06 crc kubenswrapper[4974]: I1013 18:27:06.007845 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-68d546b9d8-2t9km" event={"ID":"77a8d6d5-aa09-4168-8c4f-228849d999e2","Type":"ContainerStarted","Data":"f06d9e3d0ccd04d6484fb054c78b760270e6e154cfa86c8bcef47cd2d84f86b0"} Oct 13 18:27:06 crc kubenswrapper[4974]: I1013 18:27:06.007871 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:06 crc kubenswrapper[4974]: I1013 18:27:06.010461 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w9rrg" event={"ID":"0842201f-d8df-4376-b130-4bc0c560dc37","Type":"ContainerStarted","Data":"fb38f1a99a821c431632fc32b03b8c17e03acd9539fe3e9032f80177e47e5350"} Oct 13 18:27:06 crc kubenswrapper[4974]: I1013 18:27:06.012055 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" event={"ID":"b014985a-51e5-494a-a16b-c126e6fce6b3","Type":"ContainerStarted","Data":"6e9e043a471c94bb7bece55e7043bed2efd10ef26db4bb01730eac410f7bc3e3"} Oct 13 18:27:06 crc kubenswrapper[4974]: I1013 18:27:06.038614 4974 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="metallb-system/controller-68d546b9d8-2t9km" podStartSLOduration=2.038582161 podStartE2EDuration="2.038582161s" podCreationTimestamp="2025-10-13 18:27:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:27:06.034412173 +0000 UTC m=+760.938778293" watchObservedRunningTime="2025-10-13 18:27:06.038582161 +0000 UTC m=+760.942948281" Oct 13 18:27:06 crc kubenswrapper[4974]: I1013 18:27:06.348553 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-memberlist\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:06 crc kubenswrapper[4974]: I1013 18:27:06.371124 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/6d0e7abb-aa57-48af-9a9a-d3c626b9131a-memberlist\") pod \"speaker-lc487\" (UID: \"6d0e7abb-aa57-48af-9a9a-d3c626b9131a\") " pod="metallb-system/speaker-lc487" Oct 13 18:27:06 crc kubenswrapper[4974]: I1013 18:27:06.502355 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lc487" Oct 13 18:27:06 crc kubenswrapper[4974]: W1013 18:27:06.545224 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d0e7abb_aa57_48af_9a9a_d3c626b9131a.slice/crio-554e2fee7befb441f1f2feb6a86ce25646124557401c37509a67262118fa4b7b WatchSource:0}: Error finding container 554e2fee7befb441f1f2feb6a86ce25646124557401c37509a67262118fa4b7b: Status 404 returned error can't find the container with id 554e2fee7befb441f1f2feb6a86ce25646124557401c37509a67262118fa4b7b Oct 13 18:27:07 crc kubenswrapper[4974]: I1013 18:27:07.019875 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lc487" event={"ID":"6d0e7abb-aa57-48af-9a9a-d3c626b9131a","Type":"ContainerStarted","Data":"82d3bc9ad95165bcf5a62dca1cb881e2564310284daa3b6ea4132a31e17252d5"} Oct 13 18:27:07 crc kubenswrapper[4974]: I1013 18:27:07.019909 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lc487" event={"ID":"6d0e7abb-aa57-48af-9a9a-d3c626b9131a","Type":"ContainerStarted","Data":"554e2fee7befb441f1f2feb6a86ce25646124557401c37509a67262118fa4b7b"} Oct 13 18:27:08 crc kubenswrapper[4974]: I1013 18:27:08.031996 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lc487" event={"ID":"6d0e7abb-aa57-48af-9a9a-d3c626b9131a","Type":"ContainerStarted","Data":"dbcee02ac2032abd3f76f2b03cda6d8aa4815dc610474dc574a95bc412fde250"} Oct 13 18:27:08 crc kubenswrapper[4974]: I1013 18:27:08.032133 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-lc487" Oct 13 18:27:08 crc kubenswrapper[4974]: I1013 18:27:08.062659 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-lc487" podStartSLOduration=4.062623798 podStartE2EDuration="4.062623798s" podCreationTimestamp="2025-10-13 18:27:04 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:27:08.059435927 +0000 UTC m=+762.963802017" watchObservedRunningTime="2025-10-13 18:27:08.062623798 +0000 UTC m=+762.966989878" Oct 13 18:27:13 crc kubenswrapper[4974]: I1013 18:27:13.094965 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" event={"ID":"b014985a-51e5-494a-a16b-c126e6fce6b3","Type":"ContainerStarted","Data":"f61d10df4c433ef18a771f167a8a0afea31378a0fb3a68c7ce858953ffc10d2c"} Oct 13 18:27:13 crc kubenswrapper[4974]: I1013 18:27:13.095561 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" Oct 13 18:27:13 crc kubenswrapper[4974]: I1013 18:27:13.097790 4974 generic.go:334] "Generic (PLEG): container finished" podID="0842201f-d8df-4376-b130-4bc0c560dc37" containerID="f7e5224e61095c3819ca6faf527afbaa0cb98aa63ec5e3162d40c95061b22e9d" exitCode=0 Oct 13 18:27:13 crc kubenswrapper[4974]: I1013 18:27:13.097832 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w9rrg" event={"ID":"0842201f-d8df-4376-b130-4bc0c560dc37","Type":"ContainerDied","Data":"f7e5224e61095c3819ca6faf527afbaa0cb98aa63ec5e3162d40c95061b22e9d"} Oct 13 18:27:13 crc kubenswrapper[4974]: I1013 18:27:13.150723 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" podStartSLOduration=1.715496163 podStartE2EDuration="9.150692302s" podCreationTimestamp="2025-10-13 18:27:04 +0000 UTC" firstStartedPulling="2025-10-13 18:27:05.403954853 +0000 UTC m=+760.308320963" lastFinishedPulling="2025-10-13 18:27:12.839150982 +0000 UTC m=+767.743517102" observedRunningTime="2025-10-13 18:27:13.118228968 +0000 UTC m=+768.022595058" watchObservedRunningTime="2025-10-13 18:27:13.150692302 +0000 UTC m=+768.055058382" Oct 13 18:27:14 
crc kubenswrapper[4974]: I1013 18:27:14.107430 4974 generic.go:334] "Generic (PLEG): container finished" podID="0842201f-d8df-4376-b130-4bc0c560dc37" containerID="a2cf342a57224d9afd7529ceeef917358b5e00536dc701df9757cb73a19ab896" exitCode=0 Oct 13 18:27:14 crc kubenswrapper[4974]: I1013 18:27:14.107490 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w9rrg" event={"ID":"0842201f-d8df-4376-b130-4bc0c560dc37","Type":"ContainerDied","Data":"a2cf342a57224d9afd7529ceeef917358b5e00536dc701df9757cb73a19ab896"} Oct 13 18:27:15 crc kubenswrapper[4974]: I1013 18:27:15.017686 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-68d546b9d8-2t9km" Oct 13 18:27:15 crc kubenswrapper[4974]: I1013 18:27:15.121234 4974 generic.go:334] "Generic (PLEG): container finished" podID="0842201f-d8df-4376-b130-4bc0c560dc37" containerID="b2ff87d00ac5936cbd5ea80335a2248acb4828d1ffdc431ab128ba8e89590464" exitCode=0 Oct 13 18:27:15 crc kubenswrapper[4974]: I1013 18:27:15.121300 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w9rrg" event={"ID":"0842201f-d8df-4376-b130-4bc0c560dc37","Type":"ContainerDied","Data":"b2ff87d00ac5936cbd5ea80335a2248acb4828d1ffdc431ab128ba8e89590464"} Oct 13 18:27:16 crc kubenswrapper[4974]: I1013 18:27:16.147235 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w9rrg" event={"ID":"0842201f-d8df-4376-b130-4bc0c560dc37","Type":"ContainerStarted","Data":"ffeb23e500f86bfeef14cf8bc4b155735a61513a9a87002a2df05b79b0fa1a3a"} Oct 13 18:27:16 crc kubenswrapper[4974]: I1013 18:27:16.147595 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w9rrg" event={"ID":"0842201f-d8df-4376-b130-4bc0c560dc37","Type":"ContainerStarted","Data":"fc8dd19199f92773462eb99da05b591bf9943d13b817ee6135d03c9bbf9929a5"} Oct 13 18:27:16 crc kubenswrapper[4974]: I1013 18:27:16.147612 4974 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-w9rrg" event={"ID":"0842201f-d8df-4376-b130-4bc0c560dc37","Type":"ContainerStarted","Data":"11138933ec3156347f0e72369a025b72e87b21eb47d88f85133864ffdc410cd8"} Oct 13 18:27:16 crc kubenswrapper[4974]: I1013 18:27:16.147627 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w9rrg" event={"ID":"0842201f-d8df-4376-b130-4bc0c560dc37","Type":"ContainerStarted","Data":"2c6024ee8382d7325cf5032add5b4692b79b01af066d94cbf02f0b4f22f16829"} Oct 13 18:27:16 crc kubenswrapper[4974]: I1013 18:27:16.147642 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w9rrg" event={"ID":"0842201f-d8df-4376-b130-4bc0c560dc37","Type":"ContainerStarted","Data":"7808c336a0f371afe7806ee38e4ea1a2d25fd2d7b9b34dd18a6b4c03e7a20d69"} Oct 13 18:27:16 crc kubenswrapper[4974]: I1013 18:27:16.508016 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-lc487" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.162164 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w9rrg" event={"ID":"0842201f-d8df-4376-b130-4bc0c560dc37","Type":"ContainerStarted","Data":"41cc7bbf952e35365eeb35e65c8d0305b0c20900b646b4b298907a5160810bd0"} Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.163452 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.200053 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-w9rrg" podStartSLOduration=5.373205812 podStartE2EDuration="13.200024682s" podCreationTimestamp="2025-10-13 18:27:04 +0000 UTC" firstStartedPulling="2025-10-13 18:27:05.001264357 +0000 UTC m=+759.905630437" lastFinishedPulling="2025-10-13 18:27:12.828083197 +0000 UTC m=+767.732449307" observedRunningTime="2025-10-13 18:27:17.200020902 +0000 UTC m=+772.104386992" 
watchObservedRunningTime="2025-10-13 18:27:17.200024682 +0000 UTC m=+772.104390812" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.454498 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q2b7w"] Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.461607 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.471777 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2b7w"] Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.619923 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-catalog-content\") pod \"community-operators-q2b7w\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.619986 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvwbw\" (UniqueName: \"kubernetes.io/projected/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-kube-api-access-kvwbw\") pod \"community-operators-q2b7w\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.620135 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-utilities\") pod \"community-operators-q2b7w\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.721267 4974 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-kvwbw\" (UniqueName: \"kubernetes.io/projected/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-kube-api-access-kvwbw\") pod \"community-operators-q2b7w\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.721366 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-utilities\") pod \"community-operators-q2b7w\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.721408 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-catalog-content\") pod \"community-operators-q2b7w\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.722215 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-catalog-content\") pod \"community-operators-q2b7w\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.722358 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-utilities\") pod \"community-operators-q2b7w\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.741837 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvwbw\" 
(UniqueName: \"kubernetes.io/projected/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-kube-api-access-kvwbw\") pod \"community-operators-q2b7w\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:17 crc kubenswrapper[4974]: I1013 18:27:17.793771 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:18 crc kubenswrapper[4974]: I1013 18:27:18.285253 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2b7w"] Oct 13 18:27:18 crc kubenswrapper[4974]: W1013 18:27:18.294117 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29cb39ac_65cc_4ad5_9cdb_ebe2c0ce4cba.slice/crio-c76db4f414d164c422da451b7b7019e570caa1d1cbe316461e1b5c14ed7af53e WatchSource:0}: Error finding container c76db4f414d164c422da451b7b7019e570caa1d1cbe316461e1b5c14ed7af53e: Status 404 returned error can't find the container with id c76db4f414d164c422da451b7b7019e570caa1d1cbe316461e1b5c14ed7af53e Oct 13 18:27:19 crc kubenswrapper[4974]: I1013 18:27:19.180086 4974 generic.go:334] "Generic (PLEG): container finished" podID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerID="60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b" exitCode=0 Oct 13 18:27:19 crc kubenswrapper[4974]: I1013 18:27:19.180295 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2b7w" event={"ID":"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba","Type":"ContainerDied","Data":"60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b"} Oct 13 18:27:19 crc kubenswrapper[4974]: I1013 18:27:19.180520 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2b7w" 
event={"ID":"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba","Type":"ContainerStarted","Data":"c76db4f414d164c422da451b7b7019e570caa1d1cbe316461e1b5c14ed7af53e"} Oct 13 18:27:19 crc kubenswrapper[4974]: I1013 18:27:19.846323 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:19 crc kubenswrapper[4974]: I1013 18:27:19.902732 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:22 crc kubenswrapper[4974]: I1013 18:27:22.207257 4974 generic.go:334] "Generic (PLEG): container finished" podID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerID="0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d" exitCode=0 Oct 13 18:27:22 crc kubenswrapper[4974]: I1013 18:27:22.207413 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2b7w" event={"ID":"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba","Type":"ContainerDied","Data":"0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d"} Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.029956 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zdv77"] Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.031233 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zdv77" Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.035938 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.036015 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-fld4p" Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.037981 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.046582 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zdv77"] Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.195690 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69mmp\" (UniqueName: \"kubernetes.io/projected/208151b9-4d45-4a71-9417-5082f935fd8b-kube-api-access-69mmp\") pod \"openstack-operator-index-zdv77\" (UID: \"208151b9-4d45-4a71-9417-5082f935fd8b\") " pod="openstack-operators/openstack-operator-index-zdv77" Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.216966 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2b7w" event={"ID":"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba","Type":"ContainerStarted","Data":"d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b"} Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.234514 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q2b7w" podStartSLOduration=2.797432212 podStartE2EDuration="6.234493069s" podCreationTimestamp="2025-10-13 18:27:17 +0000 UTC" firstStartedPulling="2025-10-13 18:27:19.182789783 +0000 UTC m=+774.087155893" lastFinishedPulling="2025-10-13 
18:27:22.61985068 +0000 UTC m=+777.524216750" observedRunningTime="2025-10-13 18:27:23.233372557 +0000 UTC m=+778.137738667" watchObservedRunningTime="2025-10-13 18:27:23.234493069 +0000 UTC m=+778.138859169" Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.297082 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69mmp\" (UniqueName: \"kubernetes.io/projected/208151b9-4d45-4a71-9417-5082f935fd8b-kube-api-access-69mmp\") pod \"openstack-operator-index-zdv77\" (UID: \"208151b9-4d45-4a71-9417-5082f935fd8b\") " pod="openstack-operators/openstack-operator-index-zdv77" Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.316267 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69mmp\" (UniqueName: \"kubernetes.io/projected/208151b9-4d45-4a71-9417-5082f935fd8b-kube-api-access-69mmp\") pod \"openstack-operator-index-zdv77\" (UID: \"208151b9-4d45-4a71-9417-5082f935fd8b\") " pod="openstack-operators/openstack-operator-index-zdv77" Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.351299 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zdv77" Oct 13 18:27:23 crc kubenswrapper[4974]: I1013 18:27:23.849473 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zdv77"] Oct 13 18:27:24 crc kubenswrapper[4974]: I1013 18:27:24.229184 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zdv77" event={"ID":"208151b9-4d45-4a71-9417-5082f935fd8b","Type":"ContainerStarted","Data":"2b42b37f8ce9c88f9aba920ed7824d69e1850a7b3be2078f2bac9bd91d7916e6"} Oct 13 18:27:24 crc kubenswrapper[4974]: I1013 18:27:24.865350 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-64bf5d555-t5brv" Oct 13 18:27:26 crc kubenswrapper[4974]: I1013 18:27:26.121565 4974 scope.go:117] "RemoveContainer" containerID="dbb989c50e965a36408c384d0e26c3a7a0f1425d5c5c34894fa15048ad73f0f1" Oct 13 18:27:27 crc kubenswrapper[4974]: I1013 18:27:27.255848 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zdv77" event={"ID":"208151b9-4d45-4a71-9417-5082f935fd8b","Type":"ContainerStarted","Data":"e7d53ec48ba1116c03440645d0a798ef06fa3d29d12c2cb1f1a7294d797cb7e0"} Oct 13 18:27:27 crc kubenswrapper[4974]: I1013 18:27:27.281037 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zdv77" podStartSLOduration=1.31417054 podStartE2EDuration="4.281007575s" podCreationTimestamp="2025-10-13 18:27:23 +0000 UTC" firstStartedPulling="2025-10-13 18:27:23.86789553 +0000 UTC m=+778.772261640" lastFinishedPulling="2025-10-13 18:27:26.834732555 +0000 UTC m=+781.739098675" observedRunningTime="2025-10-13 18:27:27.271791883 +0000 UTC m=+782.176158003" watchObservedRunningTime="2025-10-13 18:27:27.281007575 +0000 UTC m=+782.185373685" Oct 13 18:27:27 crc kubenswrapper[4974]: I1013 18:27:27.794715 4974 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:27 crc kubenswrapper[4974]: I1013 18:27:27.795126 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:27 crc kubenswrapper[4974]: I1013 18:27:27.863788 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.033913 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zgldx"] Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.037196 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.055354 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgldx"] Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.166890 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-catalog-content\") pod \"certified-operators-zgldx\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.167024 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-utilities\") pod \"certified-operators-zgldx\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.167070 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kj4p\" (UniqueName: \"kubernetes.io/projected/54d8fba9-2490-4ce9-a407-f652fb5a329c-kube-api-access-9kj4p\") pod \"certified-operators-zgldx\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.268088 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-catalog-content\") pod \"certified-operators-zgldx\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.268883 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-catalog-content\") pod \"certified-operators-zgldx\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.268991 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-utilities\") pod \"certified-operators-zgldx\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.269081 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kj4p\" (UniqueName: \"kubernetes.io/projected/54d8fba9-2490-4ce9-a407-f652fb5a329c-kube-api-access-9kj4p\") pod \"certified-operators-zgldx\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.269538 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-utilities\") pod \"certified-operators-zgldx\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.310203 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kj4p\" (UniqueName: \"kubernetes.io/projected/54d8fba9-2490-4ce9-a407-f652fb5a329c-kube-api-access-9kj4p\") pod \"certified-operators-zgldx\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.331070 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.366204 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:28 crc kubenswrapper[4974]: I1013 18:27:28.893110 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zgldx"] Oct 13 18:27:28 crc kubenswrapper[4974]: W1013 18:27:28.903932 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54d8fba9_2490_4ce9_a407_f652fb5a329c.slice/crio-4055cda9f3ec9528cb37f30e1396fc814fae0260f71e903bff84803464571a1c WatchSource:0}: Error finding container 4055cda9f3ec9528cb37f30e1396fc814fae0260f71e903bff84803464571a1c: Status 404 returned error can't find the container with id 4055cda9f3ec9528cb37f30e1396fc814fae0260f71e903bff84803464571a1c Oct 13 18:27:29 crc kubenswrapper[4974]: I1013 18:27:29.279185 4974 generic.go:334] "Generic (PLEG): container finished" podID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerID="5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca" exitCode=0 Oct 13 18:27:29 crc kubenswrapper[4974]: I1013 18:27:29.279286 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgldx" event={"ID":"54d8fba9-2490-4ce9-a407-f652fb5a329c","Type":"ContainerDied","Data":"5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca"} Oct 13 18:27:29 crc kubenswrapper[4974]: I1013 18:27:29.279885 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgldx" event={"ID":"54d8fba9-2490-4ce9-a407-f652fb5a329c","Type":"ContainerStarted","Data":"4055cda9f3ec9528cb37f30e1396fc814fae0260f71e903bff84803464571a1c"} Oct 13 18:27:30 crc kubenswrapper[4974]: I1013 18:27:30.291915 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgldx" 
event={"ID":"54d8fba9-2490-4ce9-a407-f652fb5a329c","Type":"ContainerStarted","Data":"43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1"} Oct 13 18:27:31 crc kubenswrapper[4974]: I1013 18:27:31.306575 4974 generic.go:334] "Generic (PLEG): container finished" podID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerID="43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1" exitCode=0 Oct 13 18:27:31 crc kubenswrapper[4974]: I1013 18:27:31.306612 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgldx" event={"ID":"54d8fba9-2490-4ce9-a407-f652fb5a329c","Type":"ContainerDied","Data":"43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1"} Oct 13 18:27:31 crc kubenswrapper[4974]: I1013 18:27:31.619393 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2b7w"] Oct 13 18:27:31 crc kubenswrapper[4974]: I1013 18:27:31.619772 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q2b7w" podUID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerName="registry-server" containerID="cri-o://d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b" gracePeriod=2 Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.130153 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.227754 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-utilities\") pod \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.227936 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-catalog-content\") pod \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.227983 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvwbw\" (UniqueName: \"kubernetes.io/projected/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-kube-api-access-kvwbw\") pod \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\" (UID: \"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba\") " Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.230407 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-utilities" (OuterVolumeSpecName: "utilities") pod "29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" (UID: "29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.236810 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-kube-api-access-kvwbw" (OuterVolumeSpecName: "kube-api-access-kvwbw") pod "29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" (UID: "29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba"). InnerVolumeSpecName "kube-api-access-kvwbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.286402 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" (UID: "29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.314552 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgldx" event={"ID":"54d8fba9-2490-4ce9-a407-f652fb5a329c","Type":"ContainerStarted","Data":"38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9"} Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.316739 4974 generic.go:334] "Generic (PLEG): container finished" podID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerID="d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b" exitCode=0 Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.316781 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2b7w" event={"ID":"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba","Type":"ContainerDied","Data":"d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b"} Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.316812 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2b7w" event={"ID":"29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba","Type":"ContainerDied","Data":"c76db4f414d164c422da451b7b7019e570caa1d1cbe316461e1b5c14ed7af53e"} Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.316830 4974 scope.go:117] "RemoveContainer" containerID="d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.316836 4974 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2b7w" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.329682 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.329719 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvwbw\" (UniqueName: \"kubernetes.io/projected/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-kube-api-access-kvwbw\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.329733 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.332462 4974 scope.go:117] "RemoveContainer" containerID="0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.336707 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zgldx" podStartSLOduration=1.9154013810000001 podStartE2EDuration="4.336690289s" podCreationTimestamp="2025-10-13 18:27:28 +0000 UTC" firstStartedPulling="2025-10-13 18:27:29.282265068 +0000 UTC m=+784.186631178" lastFinishedPulling="2025-10-13 18:27:31.703554006 +0000 UTC m=+786.607920086" observedRunningTime="2025-10-13 18:27:32.332913592 +0000 UTC m=+787.237279672" watchObservedRunningTime="2025-10-13 18:27:32.336690289 +0000 UTC m=+787.241056369" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.351794 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2b7w"] Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.356249 4974 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/community-operators-q2b7w"] Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.358675 4974 scope.go:117] "RemoveContainer" containerID="60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.375100 4974 scope.go:117] "RemoveContainer" containerID="d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b" Oct 13 18:27:32 crc kubenswrapper[4974]: E1013 18:27:32.375458 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b\": container with ID starting with d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b not found: ID does not exist" containerID="d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.375498 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b"} err="failed to get container status \"d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b\": rpc error: code = NotFound desc = could not find container \"d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b\": container with ID starting with d948d1ea18dfcf74813bf02fb8a01e1fd9419efa987a585c32639c200432372b not found: ID does not exist" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.375524 4974 scope.go:117] "RemoveContainer" containerID="0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d" Oct 13 18:27:32 crc kubenswrapper[4974]: E1013 18:27:32.375876 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d\": container with ID starting with 
0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d not found: ID does not exist" containerID="0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.375919 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d"} err="failed to get container status \"0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d\": rpc error: code = NotFound desc = could not find container \"0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d\": container with ID starting with 0ccae8d54b54f5985ae61a2019ca4798b36371dcbb641eafd0fec4c988f0a99d not found: ID does not exist" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.375939 4974 scope.go:117] "RemoveContainer" containerID="60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b" Oct 13 18:27:32 crc kubenswrapper[4974]: E1013 18:27:32.376152 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b\": container with ID starting with 60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b not found: ID does not exist" containerID="60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b" Oct 13 18:27:32 crc kubenswrapper[4974]: I1013 18:27:32.376178 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b"} err="failed to get container status \"60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b\": rpc error: code = NotFound desc = could not find container \"60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b\": container with ID starting with 60009721527d35e09e0dc2cffde9a91af3b63ef187c409668fd0afbe1bdb282b not found: ID does not 
exist" Oct 13 18:27:33 crc kubenswrapper[4974]: I1013 18:27:33.351493 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zdv77" Oct 13 18:27:33 crc kubenswrapper[4974]: I1013 18:27:33.351962 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zdv77" Oct 13 18:27:33 crc kubenswrapper[4974]: I1013 18:27:33.395893 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zdv77" Oct 13 18:27:33 crc kubenswrapper[4974]: I1013 18:27:33.820849 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" path="/var/lib/kubelet/pods/29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba/volumes" Oct 13 18:27:34 crc kubenswrapper[4974]: I1013 18:27:34.375617 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zdv77" Oct 13 18:27:34 crc kubenswrapper[4974]: I1013 18:27:34.853064 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-w9rrg" Oct 13 18:27:37 crc kubenswrapper[4974]: I1013 18:27:37.742921 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:27:37 crc kubenswrapper[4974]: I1013 18:27:37.743017 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:27:37 crc kubenswrapper[4974]: I1013 
18:27:37.886768 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8"] Oct 13 18:27:37 crc kubenswrapper[4974]: E1013 18:27:37.888298 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerName="registry-server" Oct 13 18:27:37 crc kubenswrapper[4974]: I1013 18:27:37.888339 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerName="registry-server" Oct 13 18:27:37 crc kubenswrapper[4974]: E1013 18:27:37.888359 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerName="extract-content" Oct 13 18:27:37 crc kubenswrapper[4974]: I1013 18:27:37.888368 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerName="extract-content" Oct 13 18:27:37 crc kubenswrapper[4974]: E1013 18:27:37.888386 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerName="extract-utilities" Oct 13 18:27:37 crc kubenswrapper[4974]: I1013 18:27:37.888398 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerName="extract-utilities" Oct 13 18:27:37 crc kubenswrapper[4974]: I1013 18:27:37.888611 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="29cb39ac-65cc-4ad5-9cdb-ebe2c0ce4cba" containerName="registry-server" Oct 13 18:27:37 crc kubenswrapper[4974]: I1013 18:27:37.889927 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:37 crc kubenswrapper[4974]: I1013 18:27:37.892977 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-zbwkk" Oct 13 18:27:37 crc kubenswrapper[4974]: I1013 18:27:37.906449 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8"] Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.017396 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlcz9\" (UniqueName: \"kubernetes.io/projected/fb93a5ae-368f-4fce-b522-b318fa519ade-kube-api-access-mlcz9\") pod \"7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.017500 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-util\") pod \"7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.017526 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-bundle\") pod \"7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 
18:27:38.119493 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlcz9\" (UniqueName: \"kubernetes.io/projected/fb93a5ae-368f-4fce-b522-b318fa519ade-kube-api-access-mlcz9\") pod \"7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.119591 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-util\") pod \"7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.119621 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-bundle\") pod \"7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.120386 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-bundle\") pod \"7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.120449 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-util\") pod \"7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.164051 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlcz9\" (UniqueName: \"kubernetes.io/projected/fb93a5ae-368f-4fce-b522-b318fa519ade-kube-api-access-mlcz9\") pod \"7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.229282 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.366385 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.366435 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.462112 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:38 crc kubenswrapper[4974]: I1013 18:27:38.812223 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8"] Oct 13 18:27:39 crc kubenswrapper[4974]: I1013 18:27:39.383833 4974 generic.go:334] "Generic (PLEG): container finished" podID="fb93a5ae-368f-4fce-b522-b318fa519ade" 
containerID="09e0214abc86bae726277550036c5dc1eb2346ca6b0134fc0d6285e0bb740cb3" exitCode=0 Oct 13 18:27:39 crc kubenswrapper[4974]: I1013 18:27:39.383980 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" event={"ID":"fb93a5ae-368f-4fce-b522-b318fa519ade","Type":"ContainerDied","Data":"09e0214abc86bae726277550036c5dc1eb2346ca6b0134fc0d6285e0bb740cb3"} Oct 13 18:27:39 crc kubenswrapper[4974]: I1013 18:27:39.384944 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" event={"ID":"fb93a5ae-368f-4fce-b522-b318fa519ade","Type":"ContainerStarted","Data":"42472fa5c02d1d7ae23b4bc571849f9f8779427de70b1ffb553f5645d4f8e585"} Oct 13 18:27:39 crc kubenswrapper[4974]: I1013 18:27:39.428722 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:40 crc kubenswrapper[4974]: I1013 18:27:40.396397 4974 generic.go:334] "Generic (PLEG): container finished" podID="fb93a5ae-368f-4fce-b522-b318fa519ade" containerID="66881f5fe221759c2617214cc2ac345a912e48fdcd1b49add3ead95923d61780" exitCode=0 Oct 13 18:27:40 crc kubenswrapper[4974]: I1013 18:27:40.396470 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" event={"ID":"fb93a5ae-368f-4fce-b522-b318fa519ade","Type":"ContainerDied","Data":"66881f5fe221759c2617214cc2ac345a912e48fdcd1b49add3ead95923d61780"} Oct 13 18:27:41 crc kubenswrapper[4974]: I1013 18:27:41.406969 4974 generic.go:334] "Generic (PLEG): container finished" podID="fb93a5ae-368f-4fce-b522-b318fa519ade" containerID="f7bcab7a2dbeeacce2bb314dc3ea3d3fd355db7f439231729b423a94bd37b59f" exitCode=0 Oct 13 18:27:41 crc kubenswrapper[4974]: I1013 18:27:41.407024 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" event={"ID":"fb93a5ae-368f-4fce-b522-b318fa519ade","Type":"ContainerDied","Data":"f7bcab7a2dbeeacce2bb314dc3ea3d3fd355db7f439231729b423a94bd37b59f"} Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.026410 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xkcnk"] Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.028643 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.079381 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkcnk"] Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.217304 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2phr\" (UniqueName: \"kubernetes.io/projected/86f9637d-b499-415f-a9a4-0ffde5a536ba-kube-api-access-j2phr\") pod \"redhat-operators-xkcnk\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.218034 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-catalog-content\") pod \"redhat-operators-xkcnk\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.218272 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-utilities\") pod \"redhat-operators-xkcnk\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " pod="openshift-marketplace/redhat-operators-xkcnk" Oct 
13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.319254 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-catalog-content\") pod \"redhat-operators-xkcnk\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.319802 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-utilities\") pod \"redhat-operators-xkcnk\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.319876 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-catalog-content\") pod \"redhat-operators-xkcnk\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.320205 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2phr\" (UniqueName: \"kubernetes.io/projected/86f9637d-b499-415f-a9a4-0ffde5a536ba-kube-api-access-j2phr\") pod \"redhat-operators-xkcnk\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.320451 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-utilities\") pod \"redhat-operators-xkcnk\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.352763 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2phr\" (UniqueName: \"kubernetes.io/projected/86f9637d-b499-415f-a9a4-0ffde5a536ba-kube-api-access-j2phr\") pod \"redhat-operators-xkcnk\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.362160 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.716930 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.825392 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-bundle\") pod \"fb93a5ae-368f-4fce-b522-b318fa519ade\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.825531 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-util\") pod \"fb93a5ae-368f-4fce-b522-b318fa519ade\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.825583 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlcz9\" (UniqueName: \"kubernetes.io/projected/fb93a5ae-368f-4fce-b522-b318fa519ade-kube-api-access-mlcz9\") pod \"fb93a5ae-368f-4fce-b522-b318fa519ade\" (UID: \"fb93a5ae-368f-4fce-b522-b318fa519ade\") " Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.826330 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-bundle" (OuterVolumeSpecName: "bundle") pod "fb93a5ae-368f-4fce-b522-b318fa519ade" (UID: "fb93a5ae-368f-4fce-b522-b318fa519ade"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.831102 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb93a5ae-368f-4fce-b522-b318fa519ade-kube-api-access-mlcz9" (OuterVolumeSpecName: "kube-api-access-mlcz9") pod "fb93a5ae-368f-4fce-b522-b318fa519ade" (UID: "fb93a5ae-368f-4fce-b522-b318fa519ade"). InnerVolumeSpecName "kube-api-access-mlcz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.843900 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-util" (OuterVolumeSpecName: "util") pod "fb93a5ae-368f-4fce-b522-b318fa519ade" (UID: "fb93a5ae-368f-4fce-b522-b318fa519ade"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.875454 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xkcnk"] Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.926516 4974 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-util\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.926553 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlcz9\" (UniqueName: \"kubernetes.io/projected/fb93a5ae-368f-4fce-b522-b318fa519ade-kube-api-access-mlcz9\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:42 crc kubenswrapper[4974]: I1013 18:27:42.926565 4974 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb93a5ae-368f-4fce-b522-b318fa519ade-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:43 crc kubenswrapper[4974]: I1013 18:27:43.423879 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" event={"ID":"fb93a5ae-368f-4fce-b522-b318fa519ade","Type":"ContainerDied","Data":"42472fa5c02d1d7ae23b4bc571849f9f8779427de70b1ffb553f5645d4f8e585"} Oct 13 18:27:43 crc kubenswrapper[4974]: I1013 18:27:43.423930 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42472fa5c02d1d7ae23b4bc571849f9f8779427de70b1ffb553f5645d4f8e585" Oct 13 18:27:43 crc kubenswrapper[4974]: I1013 18:27:43.424002 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8" Oct 13 18:27:43 crc kubenswrapper[4974]: I1013 18:27:43.427482 4974 generic.go:334] "Generic (PLEG): container finished" podID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerID="a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac" exitCode=0 Oct 13 18:27:43 crc kubenswrapper[4974]: I1013 18:27:43.427509 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkcnk" event={"ID":"86f9637d-b499-415f-a9a4-0ffde5a536ba","Type":"ContainerDied","Data":"a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac"} Oct 13 18:27:43 crc kubenswrapper[4974]: I1013 18:27:43.427527 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkcnk" event={"ID":"86f9637d-b499-415f-a9a4-0ffde5a536ba","Type":"ContainerStarted","Data":"5d76ef10bbccb57155c9434f466699cc502ff9959ae713a58618e240d578179b"} Oct 13 18:27:43 crc kubenswrapper[4974]: I1013 18:27:43.619373 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zgldx"] Oct 13 18:27:43 crc kubenswrapper[4974]: I1013 18:27:43.620011 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zgldx" podUID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerName="registry-server" containerID="cri-o://38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9" gracePeriod=2 Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.109712 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.261625 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kj4p\" (UniqueName: \"kubernetes.io/projected/54d8fba9-2490-4ce9-a407-f652fb5a329c-kube-api-access-9kj4p\") pod \"54d8fba9-2490-4ce9-a407-f652fb5a329c\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.261725 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-catalog-content\") pod \"54d8fba9-2490-4ce9-a407-f652fb5a329c\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.261800 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-utilities\") pod \"54d8fba9-2490-4ce9-a407-f652fb5a329c\" (UID: \"54d8fba9-2490-4ce9-a407-f652fb5a329c\") " Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.262951 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-utilities" (OuterVolumeSpecName: "utilities") pod "54d8fba9-2490-4ce9-a407-f652fb5a329c" (UID: "54d8fba9-2490-4ce9-a407-f652fb5a329c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.269868 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54d8fba9-2490-4ce9-a407-f652fb5a329c-kube-api-access-9kj4p" (OuterVolumeSpecName: "kube-api-access-9kj4p") pod "54d8fba9-2490-4ce9-a407-f652fb5a329c" (UID: "54d8fba9-2490-4ce9-a407-f652fb5a329c"). InnerVolumeSpecName "kube-api-access-9kj4p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.304401 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "54d8fba9-2490-4ce9-a407-f652fb5a329c" (UID: "54d8fba9-2490-4ce9-a407-f652fb5a329c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.362945 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.362984 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kj4p\" (UniqueName: \"kubernetes.io/projected/54d8fba9-2490-4ce9-a407-f652fb5a329c-kube-api-access-9kj4p\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.362994 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/54d8fba9-2490-4ce9-a407-f652fb5a329c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.446120 4974 generic.go:334] "Generic (PLEG): container finished" podID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerID="38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9" exitCode=0 Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.446281 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zgldx" event={"ID":"54d8fba9-2490-4ce9-a407-f652fb5a329c","Type":"ContainerDied","Data":"38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9"} Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.446486 4974 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-zgldx" event={"ID":"54d8fba9-2490-4ce9-a407-f652fb5a329c","Type":"ContainerDied","Data":"4055cda9f3ec9528cb37f30e1396fc814fae0260f71e903bff84803464571a1c"} Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.446523 4974 scope.go:117] "RemoveContainer" containerID="38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.446366 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zgldx" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.471202 4974 scope.go:117] "RemoveContainer" containerID="43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.496802 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zgldx"] Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.505125 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zgldx"] Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.518030 4974 scope.go:117] "RemoveContainer" containerID="5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.541691 4974 scope.go:117] "RemoveContainer" containerID="38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9" Oct 13 18:27:44 crc kubenswrapper[4974]: E1013 18:27:44.542710 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9\": container with ID starting with 38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9 not found: ID does not exist" containerID="38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 
18:27:44.542791 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9"} err="failed to get container status \"38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9\": rpc error: code = NotFound desc = could not find container \"38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9\": container with ID starting with 38d97ff21d5a29d5ad5c32b735ecbf52277319b6299d775dee57ce8047c7b7d9 not found: ID does not exist" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.542851 4974 scope.go:117] "RemoveContainer" containerID="43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1" Oct 13 18:27:44 crc kubenswrapper[4974]: E1013 18:27:44.543454 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1\": container with ID starting with 43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1 not found: ID does not exist" containerID="43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.543504 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1"} err="failed to get container status \"43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1\": rpc error: code = NotFound desc = could not find container \"43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1\": container with ID starting with 43406b4dae937a426811f28ceb5c34749ed62f6be841d258e5b1544ac12fe8c1 not found: ID does not exist" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.543546 4974 scope.go:117] "RemoveContainer" containerID="5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca" Oct 13 18:27:44 crc 
kubenswrapper[4974]: E1013 18:27:44.543900 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca\": container with ID starting with 5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca not found: ID does not exist" containerID="5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca" Oct 13 18:27:44 crc kubenswrapper[4974]: I1013 18:27:44.543950 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca"} err="failed to get container status \"5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca\": rpc error: code = NotFound desc = could not find container \"5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca\": container with ID starting with 5d4b986c77262347c350da46780b246e1dcb613701df4c8d0da307ae250da3ca not found: ID does not exist" Oct 13 18:27:45 crc kubenswrapper[4974]: I1013 18:27:45.457535 4974 generic.go:334] "Generic (PLEG): container finished" podID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerID="ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e" exitCode=0 Oct 13 18:27:45 crc kubenswrapper[4974]: I1013 18:27:45.457611 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkcnk" event={"ID":"86f9637d-b499-415f-a9a4-0ffde5a536ba","Type":"ContainerDied","Data":"ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e"} Oct 13 18:27:45 crc kubenswrapper[4974]: I1013 18:27:45.820806 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54d8fba9-2490-4ce9-a407-f652fb5a329c" path="/var/lib/kubelet/pods/54d8fba9-2490-4ce9-a407-f652fb5a329c/volumes" Oct 13 18:27:46 crc kubenswrapper[4974]: I1013 18:27:46.468232 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-xkcnk" event={"ID":"86f9637d-b499-415f-a9a4-0ffde5a536ba","Type":"ContainerStarted","Data":"11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d"} Oct 13 18:27:46 crc kubenswrapper[4974]: I1013 18:27:46.496048 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xkcnk" podStartSLOduration=2.089972271 podStartE2EDuration="4.496019047s" podCreationTimestamp="2025-10-13 18:27:42 +0000 UTC" firstStartedPulling="2025-10-13 18:27:43.429641786 +0000 UTC m=+798.334007876" lastFinishedPulling="2025-10-13 18:27:45.835688582 +0000 UTC m=+800.740054652" observedRunningTime="2025-10-13 18:27:46.494388341 +0000 UTC m=+801.398754431" watchObservedRunningTime="2025-10-13 18:27:46.496019047 +0000 UTC m=+801.400385167" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.210955 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6"] Oct 13 18:27:47 crc kubenswrapper[4974]: E1013 18:27:47.211258 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerName="extract-content" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.211274 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerName="extract-content" Oct 13 18:27:47 crc kubenswrapper[4974]: E1013 18:27:47.211287 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerName="registry-server" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.211298 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerName="registry-server" Oct 13 18:27:47 crc kubenswrapper[4974]: E1013 18:27:47.211312 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb93a5ae-368f-4fce-b522-b318fa519ade" 
containerName="extract" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.211321 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb93a5ae-368f-4fce-b522-b318fa519ade" containerName="extract" Oct 13 18:27:47 crc kubenswrapper[4974]: E1013 18:27:47.211338 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb93a5ae-368f-4fce-b522-b318fa519ade" containerName="util" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.211347 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb93a5ae-368f-4fce-b522-b318fa519ade" containerName="util" Oct 13 18:27:47 crc kubenswrapper[4974]: E1013 18:27:47.211367 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerName="extract-utilities" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.211375 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerName="extract-utilities" Oct 13 18:27:47 crc kubenswrapper[4974]: E1013 18:27:47.211388 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb93a5ae-368f-4fce-b522-b318fa519ade" containerName="pull" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.211395 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb93a5ae-368f-4fce-b522-b318fa519ade" containerName="pull" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.211552 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb93a5ae-368f-4fce-b522-b318fa519ade" containerName="extract" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.211565 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="54d8fba9-2490-4ce9-a407-f652fb5a329c" containerName="registry-server" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.212372 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.217475 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-q59f4" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.270440 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6"] Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.308158 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph7p8\" (UniqueName: \"kubernetes.io/projected/72ade784-9a52-4442-b3e6-044297f70cb7-kube-api-access-ph7p8\") pod \"openstack-operator-controller-operator-8d8df4487-k7bh6\" (UID: \"72ade784-9a52-4442-b3e6-044297f70cb7\") " pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.409198 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph7p8\" (UniqueName: \"kubernetes.io/projected/72ade784-9a52-4442-b3e6-044297f70cb7-kube-api-access-ph7p8\") pod \"openstack-operator-controller-operator-8d8df4487-k7bh6\" (UID: \"72ade784-9a52-4442-b3e6-044297f70cb7\") " pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.436401 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph7p8\" (UniqueName: \"kubernetes.io/projected/72ade784-9a52-4442-b3e6-044297f70cb7-kube-api-access-ph7p8\") pod \"openstack-operator-controller-operator-8d8df4487-k7bh6\" (UID: \"72ade784-9a52-4442-b3e6-044297f70cb7\") " pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.533242 4974 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" Oct 13 18:27:47 crc kubenswrapper[4974]: I1013 18:27:47.892140 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6"] Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.485134 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" event={"ID":"72ade784-9a52-4442-b3e6-044297f70cb7","Type":"ContainerStarted","Data":"73796bf522777be8296a96857180ce496d666d95e91a071cd5c6962c2b1bdc91"} Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.634764 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6dkhw"] Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.636677 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.691399 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dkhw"] Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.739620 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbftm\" (UniqueName: \"kubernetes.io/projected/175cbc49-637c-45cd-8a08-cbb21a189cd8-kube-api-access-nbftm\") pod \"redhat-marketplace-6dkhw\" (UID: \"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.739765 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-utilities\") pod \"redhat-marketplace-6dkhw\" (UID: 
\"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.739793 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-catalog-content\") pod \"redhat-marketplace-6dkhw\" (UID: \"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.842136 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-catalog-content\") pod \"redhat-marketplace-6dkhw\" (UID: \"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.842180 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-utilities\") pod \"redhat-marketplace-6dkhw\" (UID: \"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.842206 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbftm\" (UniqueName: \"kubernetes.io/projected/175cbc49-637c-45cd-8a08-cbb21a189cd8-kube-api-access-nbftm\") pod \"redhat-marketplace-6dkhw\" (UID: \"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.843131 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-catalog-content\") pod \"redhat-marketplace-6dkhw\" (UID: 
\"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.843223 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-utilities\") pod \"redhat-marketplace-6dkhw\" (UID: \"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:48 crc kubenswrapper[4974]: I1013 18:27:48.879877 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbftm\" (UniqueName: \"kubernetes.io/projected/175cbc49-637c-45cd-8a08-cbb21a189cd8-kube-api-access-nbftm\") pod \"redhat-marketplace-6dkhw\" (UID: \"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:49 crc kubenswrapper[4974]: I1013 18:27:49.003261 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:49 crc kubenswrapper[4974]: I1013 18:27:49.441639 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dkhw"] Oct 13 18:27:52 crc kubenswrapper[4974]: I1013 18:27:52.363547 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:52 crc kubenswrapper[4974]: I1013 18:27:52.364039 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:52 crc kubenswrapper[4974]: I1013 18:27:52.419545 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:52 crc kubenswrapper[4974]: I1013 18:27:52.513504 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dkhw" 
event={"ID":"175cbc49-637c-45cd-8a08-cbb21a189cd8","Type":"ContainerStarted","Data":"8b05c10c890a556c93c5c2002227b38fb201f3e8aca11b642c367a6313d15ed4"} Oct 13 18:27:52 crc kubenswrapper[4974]: I1013 18:27:52.568048 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:53 crc kubenswrapper[4974]: I1013 18:27:53.525410 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" event={"ID":"72ade784-9a52-4442-b3e6-044297f70cb7","Type":"ContainerStarted","Data":"eb33d0747b96b90413486914eac1344b75dc56fcf9af3839c7b88676109dfa96"} Oct 13 18:27:53 crc kubenswrapper[4974]: I1013 18:27:53.528167 4974 generic.go:334] "Generic (PLEG): container finished" podID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerID="e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28" exitCode=0 Oct 13 18:27:53 crc kubenswrapper[4974]: I1013 18:27:53.528240 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dkhw" event={"ID":"175cbc49-637c-45cd-8a08-cbb21a189cd8","Type":"ContainerDied","Data":"e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28"} Oct 13 18:27:54 crc kubenswrapper[4974]: I1013 18:27:54.215237 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkcnk"] Oct 13 18:27:54 crc kubenswrapper[4974]: I1013 18:27:54.535083 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xkcnk" podUID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerName="registry-server" containerID="cri-o://11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d" gracePeriod=2 Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.271217 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.467643 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2phr\" (UniqueName: \"kubernetes.io/projected/86f9637d-b499-415f-a9a4-0ffde5a536ba-kube-api-access-j2phr\") pod \"86f9637d-b499-415f-a9a4-0ffde5a536ba\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.467696 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-utilities\") pod \"86f9637d-b499-415f-a9a4-0ffde5a536ba\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.467739 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-catalog-content\") pod \"86f9637d-b499-415f-a9a4-0ffde5a536ba\" (UID: \"86f9637d-b499-415f-a9a4-0ffde5a536ba\") " Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.470141 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-utilities" (OuterVolumeSpecName: "utilities") pod "86f9637d-b499-415f-a9a4-0ffde5a536ba" (UID: "86f9637d-b499-415f-a9a4-0ffde5a536ba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.473241 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86f9637d-b499-415f-a9a4-0ffde5a536ba-kube-api-access-j2phr" (OuterVolumeSpecName: "kube-api-access-j2phr") pod "86f9637d-b499-415f-a9a4-0ffde5a536ba" (UID: "86f9637d-b499-415f-a9a4-0ffde5a536ba"). InnerVolumeSpecName "kube-api-access-j2phr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.547943 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dkhw" event={"ID":"175cbc49-637c-45cd-8a08-cbb21a189cd8","Type":"ContainerStarted","Data":"de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67"} Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.553034 4974 generic.go:334] "Generic (PLEG): container finished" podID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerID="11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d" exitCode=0 Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.553089 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkcnk" event={"ID":"86f9637d-b499-415f-a9a4-0ffde5a536ba","Type":"ContainerDied","Data":"11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d"} Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.553111 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xkcnk" event={"ID":"86f9637d-b499-415f-a9a4-0ffde5a536ba","Type":"ContainerDied","Data":"5d76ef10bbccb57155c9434f466699cc502ff9959ae713a58618e240d578179b"} Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.553128 4974 scope.go:117] "RemoveContainer" containerID="11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.553225 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xkcnk" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.556207 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" event={"ID":"72ade784-9a52-4442-b3e6-044297f70cb7","Type":"ContainerStarted","Data":"eb201f5a46d9e8fb4f32dc53d2b352df6336839686a73127e0a121e1fe676c26"} Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.556956 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.569981 4974 scope.go:117] "RemoveContainer" containerID="ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.570599 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2phr\" (UniqueName: \"kubernetes.io/projected/86f9637d-b499-415f-a9a4-0ffde5a536ba-kube-api-access-j2phr\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.570631 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.587730 4974 scope.go:117] "RemoveContainer" containerID="a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.597020 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" podStartSLOduration=1.219222607 podStartE2EDuration="8.596996935s" podCreationTimestamp="2025-10-13 18:27:47 +0000 UTC" firstStartedPulling="2025-10-13 18:27:47.902905167 +0000 UTC m=+802.807271247" lastFinishedPulling="2025-10-13 
18:27:55.280679495 +0000 UTC m=+810.185045575" observedRunningTime="2025-10-13 18:27:55.592702083 +0000 UTC m=+810.497068183" watchObservedRunningTime="2025-10-13 18:27:55.596996935 +0000 UTC m=+810.501363015" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.608430 4974 scope.go:117] "RemoveContainer" containerID="11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d" Oct 13 18:27:55 crc kubenswrapper[4974]: E1013 18:27:55.608838 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d\": container with ID starting with 11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d not found: ID does not exist" containerID="11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.608879 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d"} err="failed to get container status \"11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d\": rpc error: code = NotFound desc = could not find container \"11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d\": container with ID starting with 11b437c85860a7622dc98364006338b74a5c80d1141b6d4a7a57a09a7781af9d not found: ID does not exist" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.608910 4974 scope.go:117] "RemoveContainer" containerID="ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e" Oct 13 18:27:55 crc kubenswrapper[4974]: E1013 18:27:55.609345 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e\": container with ID starting with ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e not found: ID 
does not exist" containerID="ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.609389 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e"} err="failed to get container status \"ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e\": rpc error: code = NotFound desc = could not find container \"ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e\": container with ID starting with ceb1116d17cd0bcc4334f108f1c1075b0dfb60b505033cdc4560c3781e38269e not found: ID does not exist" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.609420 4974 scope.go:117] "RemoveContainer" containerID="a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac" Oct 13 18:27:55 crc kubenswrapper[4974]: E1013 18:27:55.609715 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac\": container with ID starting with a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac not found: ID does not exist" containerID="a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac" Oct 13 18:27:55 crc kubenswrapper[4974]: I1013 18:27:55.609749 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac"} err="failed to get container status \"a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac\": rpc error: code = NotFound desc = could not find container \"a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac\": container with ID starting with a9c2b838cb2f00b6ee91a99662cca156c798b78a7f0215d29063fe5018b15eac not found: ID does not exist" Oct 13 18:27:56 crc kubenswrapper[4974]: I1013 18:27:56.568974 4974 
generic.go:334] "Generic (PLEG): container finished" podID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerID="de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67" exitCode=0 Oct 13 18:27:56 crc kubenswrapper[4974]: I1013 18:27:56.569052 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dkhw" event={"ID":"175cbc49-637c-45cd-8a08-cbb21a189cd8","Type":"ContainerDied","Data":"de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67"} Oct 13 18:27:56 crc kubenswrapper[4974]: I1013 18:27:56.669350 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86f9637d-b499-415f-a9a4-0ffde5a536ba" (UID: "86f9637d-b499-415f-a9a4-0ffde5a536ba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:27:56 crc kubenswrapper[4974]: I1013 18:27:56.687298 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86f9637d-b499-415f-a9a4-0ffde5a536ba-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:27:56 crc kubenswrapper[4974]: I1013 18:27:56.802147 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xkcnk"] Oct 13 18:27:56 crc kubenswrapper[4974]: I1013 18:27:56.810296 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xkcnk"] Oct 13 18:27:57 crc kubenswrapper[4974]: I1013 18:27:57.539325 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-8d8df4487-k7bh6" Oct 13 18:27:57 crc kubenswrapper[4974]: I1013 18:27:57.585746 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dkhw" 
event={"ID":"175cbc49-637c-45cd-8a08-cbb21a189cd8","Type":"ContainerStarted","Data":"5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c"} Oct 13 18:27:57 crc kubenswrapper[4974]: I1013 18:27:57.615524 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6dkhw" podStartSLOduration=6.088407786 podStartE2EDuration="9.615505427s" podCreationTimestamp="2025-10-13 18:27:48 +0000 UTC" firstStartedPulling="2025-10-13 18:27:53.529984194 +0000 UTC m=+808.434350304" lastFinishedPulling="2025-10-13 18:27:57.057081825 +0000 UTC m=+811.961447945" observedRunningTime="2025-10-13 18:27:57.611286198 +0000 UTC m=+812.515652268" watchObservedRunningTime="2025-10-13 18:27:57.615505427 +0000 UTC m=+812.519871507" Oct 13 18:27:57 crc kubenswrapper[4974]: I1013 18:27:57.822104 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86f9637d-b499-415f-a9a4-0ffde5a536ba" path="/var/lib/kubelet/pods/86f9637d-b499-415f-a9a4-0ffde5a536ba/volumes" Oct 13 18:27:59 crc kubenswrapper[4974]: I1013 18:27:59.004025 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:59 crc kubenswrapper[4974]: I1013 18:27:59.004431 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:27:59 crc kubenswrapper[4974]: I1013 18:27:59.059719 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:28:07 crc kubenswrapper[4974]: I1013 18:28:07.743321 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:28:07 crc kubenswrapper[4974]: I1013 
18:28:07.743890 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:28:09 crc kubenswrapper[4974]: I1013 18:28:09.070043 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:28:09 crc kubenswrapper[4974]: I1013 18:28:09.151403 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dkhw"] Oct 13 18:28:09 crc kubenswrapper[4974]: I1013 18:28:09.660728 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6dkhw" podUID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerName="registry-server" containerID="cri-o://5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c" gracePeriod=2 Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.122134 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.291492 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-catalog-content\") pod \"175cbc49-637c-45cd-8a08-cbb21a189cd8\" (UID: \"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.291588 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-utilities\") pod \"175cbc49-637c-45cd-8a08-cbb21a189cd8\" (UID: \"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.291643 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbftm\" (UniqueName: \"kubernetes.io/projected/175cbc49-637c-45cd-8a08-cbb21a189cd8-kube-api-access-nbftm\") pod \"175cbc49-637c-45cd-8a08-cbb21a189cd8\" (UID: \"175cbc49-637c-45cd-8a08-cbb21a189cd8\") " Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.292584 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-utilities" (OuterVolumeSpecName: "utilities") pod "175cbc49-637c-45cd-8a08-cbb21a189cd8" (UID: "175cbc49-637c-45cd-8a08-cbb21a189cd8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.304418 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "175cbc49-637c-45cd-8a08-cbb21a189cd8" (UID: "175cbc49-637c-45cd-8a08-cbb21a189cd8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.314889 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175cbc49-637c-45cd-8a08-cbb21a189cd8-kube-api-access-nbftm" (OuterVolumeSpecName: "kube-api-access-nbftm") pod "175cbc49-637c-45cd-8a08-cbb21a189cd8" (UID: "175cbc49-637c-45cd-8a08-cbb21a189cd8"). InnerVolumeSpecName "kube-api-access-nbftm". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.393563 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.393618 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbftm\" (UniqueName: \"kubernetes.io/projected/175cbc49-637c-45cd-8a08-cbb21a189cd8-kube-api-access-nbftm\") on node \"crc\" DevicePath \"\"" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.393633 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/175cbc49-637c-45cd-8a08-cbb21a189cd8-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.671349 4974 generic.go:334] "Generic (PLEG): container finished" podID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerID="5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c" exitCode=0 Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.671410 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6dkhw" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.671449 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dkhw" event={"ID":"175cbc49-637c-45cd-8a08-cbb21a189cd8","Type":"ContainerDied","Data":"5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c"} Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.671932 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6dkhw" event={"ID":"175cbc49-637c-45cd-8a08-cbb21a189cd8","Type":"ContainerDied","Data":"8b05c10c890a556c93c5c2002227b38fb201f3e8aca11b642c367a6313d15ed4"} Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.671957 4974 scope.go:117] "RemoveContainer" containerID="5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.687927 4974 scope.go:117] "RemoveContainer" containerID="de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.711356 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dkhw"] Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.711600 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6dkhw"] Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.737817 4974 scope.go:117] "RemoveContainer" containerID="e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.778463 4974 scope.go:117] "RemoveContainer" containerID="5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c" Oct 13 18:28:10 crc kubenswrapper[4974]: E1013 18:28:10.779131 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c\": container with ID starting with 5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c not found: ID does not exist" containerID="5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.779183 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c"} err="failed to get container status \"5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c\": rpc error: code = NotFound desc = could not find container \"5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c\": container with ID starting with 5fb782d99487102d49a80c656412c1303b4d125b346b3c4ffc837c36c2b58d2c not found: ID does not exist" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.779218 4974 scope.go:117] "RemoveContainer" containerID="de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67" Oct 13 18:28:10 crc kubenswrapper[4974]: E1013 18:28:10.779635 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67\": container with ID starting with de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67 not found: ID does not exist" containerID="de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.779681 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67"} err="failed to get container status \"de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67\": rpc error: code = NotFound desc = could not find container \"de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67\": container with ID 
starting with de148dbc9d42d9bc5622ba517021ad2b2bd2ccd9220383534405d74d3bbaac67 not found: ID does not exist" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.779705 4974 scope.go:117] "RemoveContainer" containerID="e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28" Oct 13 18:28:10 crc kubenswrapper[4974]: E1013 18:28:10.780141 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28\": container with ID starting with e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28 not found: ID does not exist" containerID="e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28" Oct 13 18:28:10 crc kubenswrapper[4974]: I1013 18:28:10.780174 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28"} err="failed to get container status \"e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28\": rpc error: code = NotFound desc = could not find container \"e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28\": container with ID starting with e159cc3b0fcec26ad601b5363b0b26ad953e2846e1a380f0988363654f989c28 not found: ID does not exist" Oct 13 18:28:11 crc kubenswrapper[4974]: I1013 18:28:11.825914 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="175cbc49-637c-45cd-8a08-cbb21a189cd8" path="/var/lib/kubelet/pods/175cbc49-637c-45cd-8a08-cbb21a189cd8/volumes" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.518761 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv"] Oct 13 18:28:14 crc kubenswrapper[4974]: E1013 18:28:14.519005 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerName="registry-server" 
Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.519016 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerName="registry-server" Oct 13 18:28:14 crc kubenswrapper[4974]: E1013 18:28:14.519025 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerName="extract-utilities" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.519030 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerName="extract-utilities" Oct 13 18:28:14 crc kubenswrapper[4974]: E1013 18:28:14.519044 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerName="extract-utilities" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.519050 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerName="extract-utilities" Oct 13 18:28:14 crc kubenswrapper[4974]: E1013 18:28:14.519057 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerName="registry-server" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.519062 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerName="registry-server" Oct 13 18:28:14 crc kubenswrapper[4974]: E1013 18:28:14.519072 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerName="extract-content" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.519078 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerName="extract-content" Oct 13 18:28:14 crc kubenswrapper[4974]: E1013 18:28:14.519098 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerName="extract-content" Oct 13 
18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.519103 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerName="extract-content" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.519225 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="86f9637d-b499-415f-a9a4-0ffde5a536ba" containerName="registry-server" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.519240 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="175cbc49-637c-45cd-8a08-cbb21a189cd8" containerName="registry-server" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.519846 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.524395 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.525010 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-zc4mm" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.525317 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.527907 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-jkpfs" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.537175 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.541512 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.542448 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.546388 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.547627 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-s52q9" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.550778 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.551836 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.558418 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-r65kg" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.559890 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5c56\" (UniqueName: \"kubernetes.io/projected/ef8af802-f6f6-4018-9bfd-f8aee92ff838-kube-api-access-f5c56\") pod \"barbican-operator-controller-manager-64f84fcdbb-hvqbv\" (UID: \"ef8af802-f6f6-4018-9bfd-f8aee92ff838\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.559921 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnf5g\" (UniqueName: \"kubernetes.io/projected/b44da60c-a4d1-406d-abb8-db29314b9e50-kube-api-access-fnf5g\") pod \"cinder-operator-controller-manager-59cdc64769-t2hfb\" (UID: \"b44da60c-a4d1-406d-abb8-db29314b9e50\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.559973 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-468zz\" (UniqueName: \"kubernetes.io/projected/78805c21-d9b5-4f77-a318-fa1dfa26ebc3-kube-api-access-468zz\") pod \"glance-operator-controller-manager-7bb46cd7d-wzvwc\" (UID: \"78805c21-d9b5-4f77-a318-fa1dfa26ebc3\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.560014 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv5wl\" (UniqueName: 
\"kubernetes.io/projected/758864e5-2a90-496e-b006-dcfaf42c20bb-kube-api-access-rv5wl\") pod \"designate-operator-controller-manager-687df44cdb-h2cmd\" (UID: \"758864e5-2a90-496e-b006-dcfaf42c20bb\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.583401 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.598802 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.599730 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.603765 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.614739 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-d7djq" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.616547 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.617667 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.620725 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.621829 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-rsdhj" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.628809 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.641245 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.659004 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.678217 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c56\" (UniqueName: \"kubernetes.io/projected/ef8af802-f6f6-4018-9bfd-f8aee92ff838-kube-api-access-f5c56\") pod \"barbican-operator-controller-manager-64f84fcdbb-hvqbv\" (UID: \"ef8af802-f6f6-4018-9bfd-f8aee92ff838\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.678304 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnf5g\" (UniqueName: \"kubernetes.io/projected/b44da60c-a4d1-406d-abb8-db29314b9e50-kube-api-access-fnf5g\") pod \"cinder-operator-controller-manager-59cdc64769-t2hfb\" (UID: \"b44da60c-a4d1-406d-abb8-db29314b9e50\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.678577 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-468zz\" (UniqueName: \"kubernetes.io/projected/78805c21-d9b5-4f77-a318-fa1dfa26ebc3-kube-api-access-468zz\") pod \"glance-operator-controller-manager-7bb46cd7d-wzvwc\" (UID: \"78805c21-d9b5-4f77-a318-fa1dfa26ebc3\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.678640 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv5wl\" (UniqueName: \"kubernetes.io/projected/758864e5-2a90-496e-b006-dcfaf42c20bb-kube-api-access-rv5wl\") pod \"designate-operator-controller-manager-687df44cdb-h2cmd\" (UID: \"758864e5-2a90-496e-b006-dcfaf42c20bb\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 
18:28:14.679563 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-99bsm" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.730071 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.733365 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnf5g\" (UniqueName: \"kubernetes.io/projected/b44da60c-a4d1-406d-abb8-db29314b9e50-kube-api-access-fnf5g\") pod \"cinder-operator-controller-manager-59cdc64769-t2hfb\" (UID: \"b44da60c-a4d1-406d-abb8-db29314b9e50\") " pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.739145 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5c56\" (UniqueName: \"kubernetes.io/projected/ef8af802-f6f6-4018-9bfd-f8aee92ff838-kube-api-access-f5c56\") pod \"barbican-operator-controller-manager-64f84fcdbb-hvqbv\" (UID: \"ef8af802-f6f6-4018-9bfd-f8aee92ff838\") " pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.739638 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-468zz\" (UniqueName: \"kubernetes.io/projected/78805c21-d9b5-4f77-a318-fa1dfa26ebc3-kube-api-access-468zz\") pod \"glance-operator-controller-manager-7bb46cd7d-wzvwc\" (UID: \"78805c21-d9b5-4f77-a318-fa1dfa26ebc3\") " pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.740526 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.741431 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv5wl\" (UniqueName: \"kubernetes.io/projected/758864e5-2a90-496e-b006-dcfaf42c20bb-kube-api-access-rv5wl\") pod \"designate-operator-controller-manager-687df44cdb-h2cmd\" (UID: \"758864e5-2a90-496e-b006-dcfaf42c20bb\") " pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.743639 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-q8vwn" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.743896 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.767838 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.780249 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9z56\" (UniqueName: \"kubernetes.io/projected/f332d432-86f0-4c0b-80d6-dba6e2920a81-kube-api-access-w9z56\") pod \"horizon-operator-controller-manager-6d74794d9b-cp96l\" (UID: \"f332d432-86f0-4c0b-80d6-dba6e2920a81\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.780317 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt4hs\" (UniqueName: \"kubernetes.io/projected/50ce5538-ff95-4983-8ff7-3a406b974617-kube-api-access-nt4hs\") pod \"heat-operator-controller-manager-6d9967f8dd-n4n2k\" (UID: 
\"50ce5538-ff95-4983-8ff7-3a406b974617\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.780362 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc2kx\" (UniqueName: \"kubernetes.io/projected/e5d3e6f8-15bf-4544-b701-da591158af75-kube-api-access-vc2kx\") pod \"ironic-operator-controller-manager-74cb5cbc49-w4lrj\" (UID: \"e5d3e6f8-15bf-4544-b701-da591158af75\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.787192 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.788440 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.795532 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-ttxcr" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.795619 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.800994 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.811348 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.812415 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.815201 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-vttnt" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.836976 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.838040 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.842868 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.844023 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.844915 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.847643 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-nbz2s" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.850486 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.851663 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.856549 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7mxck" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.866930 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.868553 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.869536 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.872159 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.872226 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8kf7h" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.874679 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.877553 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.883771 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc2kx\" (UniqueName: \"kubernetes.io/projected/e5d3e6f8-15bf-4544-b701-da591158af75-kube-api-access-vc2kx\") pod \"ironic-operator-controller-manager-74cb5cbc49-w4lrj\" (UID: \"e5d3e6f8-15bf-4544-b701-da591158af75\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.883848 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtrc2\" (UniqueName: \"kubernetes.io/projected/e152664c-85e7-4854-8960-ee413a7eb3a3-kube-api-access-gtrc2\") pod \"infra-operator-controller-manager-585fc5b659-rzm52\" (UID: \"e152664c-85e7-4854-8960-ee413a7eb3a3\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.883876 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9z56\" (UniqueName: \"kubernetes.io/projected/f332d432-86f0-4c0b-80d6-dba6e2920a81-kube-api-access-w9z56\") pod \"horizon-operator-controller-manager-6d74794d9b-cp96l\" (UID: \"f332d432-86f0-4c0b-80d6-dba6e2920a81\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.883894 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e152664c-85e7-4854-8960-ee413a7eb3a3-cert\") pod 
\"infra-operator-controller-manager-585fc5b659-rzm52\" (UID: \"e152664c-85e7-4854-8960-ee413a7eb3a3\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.883940 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt4hs\" (UniqueName: \"kubernetes.io/projected/50ce5538-ff95-4983-8ff7-3a406b974617-kube-api-access-nt4hs\") pod \"heat-operator-controller-manager-6d9967f8dd-n4n2k\" (UID: \"50ce5538-ff95-4983-8ff7-3a406b974617\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.892733 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.900708 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.902024 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.911043 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-fgc95" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.924450 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc2kx\" (UniqueName: \"kubernetes.io/projected/e5d3e6f8-15bf-4544-b701-da591158af75-kube-api-access-vc2kx\") pod \"ironic-operator-controller-manager-74cb5cbc49-w4lrj\" (UID: \"e5d3e6f8-15bf-4544-b701-da591158af75\") " pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.925466 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nt4hs\" (UniqueName: \"kubernetes.io/projected/50ce5538-ff95-4983-8ff7-3a406b974617-kube-api-access-nt4hs\") pod \"heat-operator-controller-manager-6d9967f8dd-n4n2k\" (UID: \"50ce5538-ff95-4983-8ff7-3a406b974617\") " pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.931784 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9z56\" (UniqueName: \"kubernetes.io/projected/f332d432-86f0-4c0b-80d6-dba6e2920a81-kube-api-access-w9z56\") pod \"horizon-operator-controller-manager-6d74794d9b-cp96l\" (UID: \"f332d432-86f0-4c0b-80d6-dba6e2920a81\") " pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.942288 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.964151 4974 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.969778 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.970252 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.974003 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hlvq9" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.986590 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-st2wm\" (UniqueName: \"kubernetes.io/projected/86f89f48-3e17-4ed9-9cbb-6458223a1864-kube-api-access-st2wm\") pod \"neutron-operator-controller-manager-797d478b46-7p29r\" (UID: \"86f89f48-3e17-4ed9-9cbb-6458223a1864\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.986853 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6c6n\" (UniqueName: \"kubernetes.io/projected/bcad591b-b126-4da8-a21c-636d710329b8-kube-api-access-g6c6n\") pod \"manila-operator-controller-manager-59578bc799-zkfrg\" (UID: \"bcad591b-b126-4da8-a21c-636d710329b8\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.986928 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rtw2\" (UniqueName: \"kubernetes.io/projected/197d51a8-e30e-485c-8e76-bd4ee120da7b-kube-api-access-8rtw2\") pod 
\"nova-operator-controller-manager-57bb74c7bf-2xgsp\" (UID: \"197d51a8-e30e-485c-8e76-bd4ee120da7b\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.987005 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtrc2\" (UniqueName: \"kubernetes.io/projected/e152664c-85e7-4854-8960-ee413a7eb3a3-kube-api-access-gtrc2\") pod \"infra-operator-controller-manager-585fc5b659-rzm52\" (UID: \"e152664c-85e7-4854-8960-ee413a7eb3a3\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.987077 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e152664c-85e7-4854-8960-ee413a7eb3a3-cert\") pod \"infra-operator-controller-manager-585fc5b659-rzm52\" (UID: \"e152664c-85e7-4854-8960-ee413a7eb3a3\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.987149 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnmcx\" (UniqueName: \"kubernetes.io/projected/923ead90-d60a-431b-9630-693bdc007237-kube-api-access-fnmcx\") pod \"octavia-operator-controller-manager-6d7c7ddf95-zx7qh\" (UID: \"923ead90-d60a-431b-9630-693bdc007237\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.987224 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htt6n\" (UniqueName: \"kubernetes.io/projected/f9ed9202-2a09-42d2-b140-8300e108e36a-kube-api-access-htt6n\") pod \"ovn-operator-controller-manager-869cc7797f-8ftd2\" (UID: \"f9ed9202-2a09-42d2-b140-8300e108e36a\") " 
pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.987303 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tflgg\" (UniqueName: \"kubernetes.io/projected/2b43f3c2-b280-40e9-9467-181a372011e1-kube-api-access-tflgg\") pod \"keystone-operator-controller-manager-ddb98f99b-hvwzb\" (UID: \"2b43f3c2-b280-40e9-9467-181a372011e1\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.987369 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5744t\" (UniqueName: \"kubernetes.io/projected/c10ae245-c899-4ea9-9edb-d62b176d19cc-kube-api-access-5744t\") pod \"mariadb-operator-controller-manager-5777b4f897-gq4sm\" (UID: \"c10ae245-c899-4ea9-9edb-d62b176d19cc\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm" Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.988780 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2"] Oct 13 18:28:14 crc kubenswrapper[4974]: I1013 18:28:14.994440 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e152664c-85e7-4854-8960-ee413a7eb3a3-cert\") pod \"infra-operator-controller-manager-585fc5b659-rzm52\" (UID: \"e152664c-85e7-4854-8960-ee413a7eb3a3\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.009138 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.015947 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtrc2\" (UniqueName: \"kubernetes.io/projected/e152664c-85e7-4854-8960-ee413a7eb3a3-kube-api-access-gtrc2\") pod \"infra-operator-controller-manager-585fc5b659-rzm52\" (UID: \"e152664c-85e7-4854-8960-ee413a7eb3a3\") " pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.027338 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w"] Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.029962 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.032326 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-t6vfz" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.033057 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.037240 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf"] Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.038323 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.042932 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-9jm8n" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.053103 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w"] Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.072120 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf"] Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.085232 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp"] Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.086388 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.091791 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-95blp" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092287 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6hsw\" (UniqueName: \"kubernetes.io/projected/4cbd873e-490d-4f1c-91cc-4ca45f109d7f-kube-api-access-h6hsw\") pod \"placement-operator-controller-manager-664664cb68-rs4rf\" (UID: \"4cbd873e-490d-4f1c-91cc-4ca45f109d7f\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092396 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-279b4\" (UniqueName: \"kubernetes.io/projected/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-kube-api-access-279b4\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w\" (UID: \"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092442 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-st2wm\" (UniqueName: \"kubernetes.io/projected/86f89f48-3e17-4ed9-9cbb-6458223a1864-kube-api-access-st2wm\") pod \"neutron-operator-controller-manager-797d478b46-7p29r\" (UID: \"86f89f48-3e17-4ed9-9cbb-6458223a1864\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092463 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6c6n\" (UniqueName: 
\"kubernetes.io/projected/bcad591b-b126-4da8-a21c-636d710329b8-kube-api-access-g6c6n\") pod \"manila-operator-controller-manager-59578bc799-zkfrg\" (UID: \"bcad591b-b126-4da8-a21c-636d710329b8\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092487 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rtw2\" (UniqueName: \"kubernetes.io/projected/197d51a8-e30e-485c-8e76-bd4ee120da7b-kube-api-access-8rtw2\") pod \"nova-operator-controller-manager-57bb74c7bf-2xgsp\" (UID: \"197d51a8-e30e-485c-8e76-bd4ee120da7b\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092520 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t5fr\" (UniqueName: \"kubernetes.io/projected/45bbc336-9feb-40e0-b7a9-92fad85e7396-kube-api-access-2t5fr\") pod \"swift-operator-controller-manager-5f4d5dfdc6-mj7kp\" (UID: \"45bbc336-9feb-40e0-b7a9-92fad85e7396\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092557 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnmcx\" (UniqueName: \"kubernetes.io/projected/923ead90-d60a-431b-9630-693bdc007237-kube-api-access-fnmcx\") pod \"octavia-operator-controller-manager-6d7c7ddf95-zx7qh\" (UID: \"923ead90-d60a-431b-9630-693bdc007237\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092581 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htt6n\" (UniqueName: \"kubernetes.io/projected/f9ed9202-2a09-42d2-b140-8300e108e36a-kube-api-access-htt6n\") pod \"ovn-operator-controller-manager-869cc7797f-8ftd2\" 
(UID: \"f9ed9202-2a09-42d2-b140-8300e108e36a\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092600 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tflgg\" (UniqueName: \"kubernetes.io/projected/2b43f3c2-b280-40e9-9467-181a372011e1-kube-api-access-tflgg\") pod \"keystone-operator-controller-manager-ddb98f99b-hvwzb\" (UID: \"2b43f3c2-b280-40e9-9467-181a372011e1\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092619 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5744t\" (UniqueName: \"kubernetes.io/projected/c10ae245-c899-4ea9-9edb-d62b176d19cc-kube-api-access-5744t\") pod \"mariadb-operator-controller-manager-5777b4f897-gq4sm\" (UID: \"c10ae245-c899-4ea9-9edb-d62b176d19cc\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.092642 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w\" (UID: \"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.101338 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.109910 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8"] Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.111030 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.119057 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bb2ds" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.148435 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htt6n\" (UniqueName: \"kubernetes.io/projected/f9ed9202-2a09-42d2-b140-8300e108e36a-kube-api-access-htt6n\") pod \"ovn-operator-controller-manager-869cc7797f-8ftd2\" (UID: \"f9ed9202-2a09-42d2-b140-8300e108e36a\") " pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.158021 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6c6n\" (UniqueName: \"kubernetes.io/projected/bcad591b-b126-4da8-a21c-636d710329b8-kube-api-access-g6c6n\") pod \"manila-operator-controller-manager-59578bc799-zkfrg\" (UID: \"bcad591b-b126-4da8-a21c-636d710329b8\") " pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.158128 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp"] Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.170954 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8"] Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.175910 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tflgg\" (UniqueName: \"kubernetes.io/projected/2b43f3c2-b280-40e9-9467-181a372011e1-kube-api-access-tflgg\") pod \"keystone-operator-controller-manager-ddb98f99b-hvwzb\" (UID: \"2b43f3c2-b280-40e9-9467-181a372011e1\") " pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.184851 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5744t\" (UniqueName: \"kubernetes.io/projected/c10ae245-c899-4ea9-9edb-d62b176d19cc-kube-api-access-5744t\") pod \"mariadb-operator-controller-manager-5777b4f897-gq4sm\" (UID: \"c10ae245-c899-4ea9-9edb-d62b176d19cc\") " pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.192796 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rtw2\" (UniqueName: \"kubernetes.io/projected/197d51a8-e30e-485c-8e76-bd4ee120da7b-kube-api-access-8rtw2\") pod \"nova-operator-controller-manager-57bb74c7bf-2xgsp\" (UID: \"197d51a8-e30e-485c-8e76-bd4ee120da7b\") " pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.193288 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnmcx\" (UniqueName: \"kubernetes.io/projected/923ead90-d60a-431b-9630-693bdc007237-kube-api-access-fnmcx\") pod \"octavia-operator-controller-manager-6d7c7ddf95-zx7qh\" (UID: \"923ead90-d60a-431b-9630-693bdc007237\") " pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.193305 4974 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-st2wm\" (UniqueName: \"kubernetes.io/projected/86f89f48-3e17-4ed9-9cbb-6458223a1864-kube-api-access-st2wm\") pod \"neutron-operator-controller-manager-797d478b46-7p29r\" (UID: \"86f89f48-3e17-4ed9-9cbb-6458223a1864\") " pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.193509 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t5fr\" (UniqueName: \"kubernetes.io/projected/45bbc336-9feb-40e0-b7a9-92fad85e7396-kube-api-access-2t5fr\") pod \"swift-operator-controller-manager-5f4d5dfdc6-mj7kp\" (UID: \"45bbc336-9feb-40e0-b7a9-92fad85e7396\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.193602 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w\" (UID: \"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.193643 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6hsw\" (UniqueName: \"kubernetes.io/projected/4cbd873e-490d-4f1c-91cc-4ca45f109d7f-kube-api-access-h6hsw\") pod \"placement-operator-controller-manager-664664cb68-rs4rf\" (UID: \"4cbd873e-490d-4f1c-91cc-4ca45f109d7f\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.193749 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-km6j8\" (UniqueName: 
\"kubernetes.io/projected/7269886e-6ad1-43fe-a8f2-c535dffe836c-kube-api-access-km6j8\") pod \"telemetry-operator-controller-manager-578874c84d-2q9c8\" (UID: \"7269886e-6ad1-43fe-a8f2-c535dffe836c\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.193803 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-279b4\" (UniqueName: \"kubernetes.io/projected/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-kube-api-access-279b4\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w\" (UID: \"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" Oct 13 18:28:15 crc kubenswrapper[4974]: E1013 18:28:15.193831 4974 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 13 18:28:15 crc kubenswrapper[4974]: E1013 18:28:15.193886 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-cert podName:d95330d6-c9ed-4fe6-8daa-6ef9495e72ae nodeName:}" failed. No retries permitted until 2025-10-13 18:28:15.693870527 +0000 UTC m=+830.598236607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" (UID: "d95330d6-c9ed-4fe6-8daa-6ef9495e72ae") : secret "openstack-baremetal-operator-webhook-server-cert" not found Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.203242 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln"] Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.205227 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.211643 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln"]
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.212032 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mwlzf"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.217976 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.229866 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd"]
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.231256 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.236029 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd"]
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.240064 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5ktvp"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.240424 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t5fr\" (UniqueName: \"kubernetes.io/projected/45bbc336-9feb-40e0-b7a9-92fad85e7396-kube-api-access-2t5fr\") pod \"swift-operator-controller-manager-5f4d5dfdc6-mj7kp\" (UID: \"45bbc336-9feb-40e0-b7a9-92fad85e7396\") " pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.249206 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"]
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.250031 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6hsw\" (UniqueName: \"kubernetes.io/projected/4cbd873e-490d-4f1c-91cc-4ca45f109d7f-kube-api-access-h6hsw\") pod \"placement-operator-controller-manager-664664cb68-rs4rf\" (UID: \"4cbd873e-490d-4f1c-91cc-4ca45f109d7f\") " pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.250724 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.251364 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-279b4\" (UniqueName: \"kubernetes.io/projected/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-kube-api-access-279b4\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w\" (UID: \"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.252469 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.253026 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mzkwz"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.253339 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.261447 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"]
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.268161 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.275743 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.284996 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.295114 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8nw5\" (UniqueName: \"kubernetes.io/projected/92f85149-41d6-471d-8d77-25fdafb20ca2-kube-api-access-h8nw5\") pod \"openstack-operator-controller-manager-7995b9c57f-x4jst\" (UID: \"92f85149-41d6-471d-8d77-25fdafb20ca2\") " pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.295175 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp7dt\" (UniqueName: \"kubernetes.io/projected/7bb571d8-3894-46f5-a627-932b5dfdc2fd-kube-api-access-fp7dt\") pod \"watcher-operator-controller-manager-6f64d8b78-d6wjd\" (UID: \"7bb571d8-3894-46f5-a627-932b5dfdc2fd\") " pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.295256 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-km6j8\" (UniqueName: \"kubernetes.io/projected/7269886e-6ad1-43fe-a8f2-c535dffe836c-kube-api-access-km6j8\") pod \"telemetry-operator-controller-manager-578874c84d-2q9c8\" (UID: \"7269886e-6ad1-43fe-a8f2-c535dffe836c\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.295285 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92f85149-41d6-471d-8d77-25fdafb20ca2-cert\") pod \"openstack-operator-controller-manager-7995b9c57f-x4jst\" (UID: \"92f85149-41d6-471d-8d77-25fdafb20ca2\") " pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.295317 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvq9q\" (UniqueName: \"kubernetes.io/projected/e6e02a94-3239-4e8b-8d87-4adb4ebcc98b-kube-api-access-pvq9q\") pod \"test-operator-controller-manager-ffcdd6c94-lb8ln\" (UID: \"e6e02a94-3239-4e8b-8d87-4adb4ebcc98b\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.313582 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.314068 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.320977 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.329308 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-km6j8\" (UniqueName: \"kubernetes.io/projected/7269886e-6ad1-43fe-a8f2-c535dffe836c-kube-api-access-km6j8\") pod \"telemetry-operator-controller-manager-578874c84d-2q9c8\" (UID: \"7269886e-6ad1-43fe-a8f2-c535dffe836c\") " pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.335991 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9"]
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.337973 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.340089 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-8ql7z"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.348906 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9"]
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.397775 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8nw5\" (UniqueName: \"kubernetes.io/projected/92f85149-41d6-471d-8d77-25fdafb20ca2-kube-api-access-h8nw5\") pod \"openstack-operator-controller-manager-7995b9c57f-x4jst\" (UID: \"92f85149-41d6-471d-8d77-25fdafb20ca2\") " pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.397833 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp7dt\" (UniqueName: \"kubernetes.io/projected/7bb571d8-3894-46f5-a627-932b5dfdc2fd-kube-api-access-fp7dt\") pod \"watcher-operator-controller-manager-6f64d8b78-d6wjd\" (UID: \"7bb571d8-3894-46f5-a627-932b5dfdc2fd\") " pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.397891 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfqp5\" (UniqueName: \"kubernetes.io/projected/9a259044-9901-4a97-89f7-965118976af7-kube-api-access-tfqp5\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-scqj9\" (UID: \"9a259044-9901-4a97-89f7-965118976af7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.397937 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92f85149-41d6-471d-8d77-25fdafb20ca2-cert\") pod \"openstack-operator-controller-manager-7995b9c57f-x4jst\" (UID: \"92f85149-41d6-471d-8d77-25fdafb20ca2\") " pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.397966 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvq9q\" (UniqueName: \"kubernetes.io/projected/e6e02a94-3239-4e8b-8d87-4adb4ebcc98b-kube-api-access-pvq9q\") pod \"test-operator-controller-manager-ffcdd6c94-lb8ln\" (UID: \"e6e02a94-3239-4e8b-8d87-4adb4ebcc98b\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln"
Oct 13 18:28:15 crc kubenswrapper[4974]: E1013 18:28:15.398810 4974 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Oct 13 18:28:15 crc kubenswrapper[4974]: E1013 18:28:15.398880 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92f85149-41d6-471d-8d77-25fdafb20ca2-cert podName:92f85149-41d6-471d-8d77-25fdafb20ca2 nodeName:}" failed. No retries permitted until 2025-10-13 18:28:15.898863927 +0000 UTC m=+830.803230007 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92f85149-41d6-471d-8d77-25fdafb20ca2-cert") pod "openstack-operator-controller-manager-7995b9c57f-x4jst" (UID: "92f85149-41d6-471d-8d77-25fdafb20ca2") : secret "webhook-server-cert" not found
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.413021 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.422581 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp7dt\" (UniqueName: \"kubernetes.io/projected/7bb571d8-3894-46f5-a627-932b5dfdc2fd-kube-api-access-fp7dt\") pod \"watcher-operator-controller-manager-6f64d8b78-d6wjd\" (UID: \"7bb571d8-3894-46f5-a627-932b5dfdc2fd\") " pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.424932 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvq9q\" (UniqueName: \"kubernetes.io/projected/e6e02a94-3239-4e8b-8d87-4adb4ebcc98b-kube-api-access-pvq9q\") pod \"test-operator-controller-manager-ffcdd6c94-lb8ln\" (UID: \"e6e02a94-3239-4e8b-8d87-4adb4ebcc98b\") " pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.431072 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8nw5\" (UniqueName: \"kubernetes.io/projected/92f85149-41d6-471d-8d77-25fdafb20ca2-kube-api-access-h8nw5\") pod \"openstack-operator-controller-manager-7995b9c57f-x4jst\" (UID: \"92f85149-41d6-471d-8d77-25fdafb20ca2\") " pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.440960 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.499231 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfqp5\" (UniqueName: \"kubernetes.io/projected/9a259044-9901-4a97-89f7-965118976af7-kube-api-access-tfqp5\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-scqj9\" (UID: \"9a259044-9901-4a97-89f7-965118976af7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.517270 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfqp5\" (UniqueName: \"kubernetes.io/projected/9a259044-9901-4a97-89f7-965118976af7-kube-api-access-tfqp5\") pod \"rabbitmq-cluster-operator-manager-5f97d8c699-scqj9\" (UID: \"9a259044-9901-4a97-89f7-965118976af7\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.612645 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.634939 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.668992 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.706371 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w\" (UID: \"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w"
Oct 13 18:28:15 crc kubenswrapper[4974]: E1013 18:28:15.706517 4974 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 13 18:28:15 crc kubenswrapper[4974]: E1013 18:28:15.706568 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-cert podName:d95330d6-c9ed-4fe6-8daa-6ef9495e72ae nodeName:}" failed. No retries permitted until 2025-10-13 18:28:16.706551672 +0000 UTC m=+831.610917752 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-cert") pod "openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" (UID: "d95330d6-c9ed-4fe6-8daa-6ef9495e72ae") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.713963 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9"
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.767354 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv"]
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.799490 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb"]
Oct 13 18:28:15 crc kubenswrapper[4974]: W1013 18:28:15.865326 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef8af802_f6f6_4018_9bfd_f8aee92ff838.slice/crio-43e77a09271b67f1b5a0275932aa715646ecd800ec925d49f3df09afa3f1324a WatchSource:0}: Error finding container 43e77a09271b67f1b5a0275932aa715646ecd800ec925d49f3df09afa3f1324a: Status 404 returned error can't find the container with id 43e77a09271b67f1b5a0275932aa715646ecd800ec925d49f3df09afa3f1324a
Oct 13 18:28:15 crc kubenswrapper[4974]: I1013 18:28:15.929418 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92f85149-41d6-471d-8d77-25fdafb20ca2-cert\") pod \"openstack-operator-controller-manager-7995b9c57f-x4jst\" (UID: \"92f85149-41d6-471d-8d77-25fdafb20ca2\") " pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"
Oct 13 18:28:15 crc kubenswrapper[4974]: E1013 18:28:15.931300 4974 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Oct 13 18:28:15 crc kubenswrapper[4974]: E1013 18:28:15.931337 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/92f85149-41d6-471d-8d77-25fdafb20ca2-cert podName:92f85149-41d6-471d-8d77-25fdafb20ca2 nodeName:}" failed. No retries permitted until 2025-10-13 18:28:16.931324703 +0000 UTC m=+831.835690783 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/92f85149-41d6-471d-8d77-25fdafb20ca2-cert") pod "openstack-operator-controller-manager-7995b9c57f-x4jst" (UID: "92f85149-41d6-471d-8d77-25fdafb20ca2") : secret "webhook-server-cert" not found
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.138001 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.172204 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.198587 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc"]
Oct 13 18:28:16 crc kubenswrapper[4974]: W1013 18:28:16.323642 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f89f48_3e17_4ed9_9cbb_6458223a1864.slice/crio-d1611708ffbed36dbb2a09b038e603dfeecaff418a9610ab9bd336c013029dc2 WatchSource:0}: Error finding container d1611708ffbed36dbb2a09b038e603dfeecaff418a9610ab9bd336c013029dc2: Status 404 returned error can't find the container with id d1611708ffbed36dbb2a09b038e603dfeecaff418a9610ab9bd336c013029dc2
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.324118 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd"]
Oct 13 18:28:16 crc kubenswrapper[4974]: W1013 18:28:16.324257 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod758864e5_2a90_496e_b006_dcfaf42c20bb.slice/crio-080b4e6d19981f32e47d8d31c20a95a50a7dba6c5926f6c6c68b3e690557822f WatchSource:0}: Error finding container 080b4e6d19981f32e47d8d31c20a95a50a7dba6c5926f6c6c68b3e690557822f: Status 404 returned error can't find the container with id 080b4e6d19981f32e47d8d31c20a95a50a7dba6c5926f6c6c68b3e690557822f
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.332229 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.511590 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.541133 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.552064 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.556875 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf"]
Oct 13 18:28:16 crc kubenswrapper[4974]: W1013 18:28:16.599917 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cbd873e_490d_4f1c_91cc_4ca45f109d7f.slice/crio-4c8fcce68e1c728fad75033ce10c38e9d4e3bac4441bd8b2083947a428919054 WatchSource:0}: Error finding container 4c8fcce68e1c728fad75033ce10c38e9d4e3bac4441bd8b2083947a428919054: Status 404 returned error can't find the container with id 4c8fcce68e1c728fad75033ce10c38e9d4e3bac4441bd8b2083947a428919054
Oct 13 18:28:16 crc kubenswrapper[4974]: W1013 18:28:16.600255 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode152664c_85e7_4854_8960_ee413a7eb3a3.slice/crio-cb1dd5ada90e6685f3232268bbbce17eed4cbb90f775aa574e59adc464563dbd WatchSource:0}: Error finding container cb1dd5ada90e6685f3232268bbbce17eed4cbb90f775aa574e59adc464563dbd: Status 404 returned error can't find the container with id cb1dd5ada90e6685f3232268bbbce17eed4cbb90f775aa574e59adc464563dbd
Oct 13 18:28:16 crc kubenswrapper[4974]: W1013 18:28:16.606062 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50ce5538_ff95_4983_8ff7_3a406b974617.slice/crio-074653e4ae193f298aeb39c10995d43177a14273731fc8b57f6f3f1061a103c3 WatchSource:0}: Error finding container 074653e4ae193f298aeb39c10995d43177a14273731fc8b57f6f3f1061a103c3: Status 404 returned error can't find the container with id 074653e4ae193f298aeb39c10995d43177a14273731fc8b57f6f3f1061a103c3
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.634822 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.647746 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg"]
Oct 13 18:28:16 crc kubenswrapper[4974]: W1013 18:28:16.649840 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc10ae245_c899_4ea9_9edb_d62b176d19cc.slice/crio-51a4c1cdd7c1e6b2608d24fb2e32d4571430617d21fcb4b9c3f4ec2b7f100e00 WatchSource:0}: Error finding container 51a4c1cdd7c1e6b2608d24fb2e32d4571430617d21fcb4b9c3f4ec2b7f100e00: Status 404 returned error can't find the container with id 51a4c1cdd7c1e6b2608d24fb2e32d4571430617d21fcb4b9c3f4ec2b7f100e00
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.652236 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2"]
Oct 13 18:28:16 crc kubenswrapper[4974]: W1013 18:28:16.663884 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcad591b_b126_4da8_a21c_636d710329b8.slice/crio-be687103f5a85d4a98a43dcfb8af4f34136c95616fb5b9cb5f7a39c13250b5fc WatchSource:0}: Error finding container be687103f5a85d4a98a43dcfb8af4f34136c95616fb5b9cb5f7a39c13250b5fc: Status 404 returned error can't find the container with id be687103f5a85d4a98a43dcfb8af4f34136c95616fb5b9cb5f7a39c13250b5fc
Oct 13 18:28:16 crc kubenswrapper[4974]: W1013 18:28:16.667446 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9ed9202_2a09_42d2_b140_8300e108e36a.slice/crio-a923739aaf3ddf2562606a2cf5784f9bc353c84e0d81581906ab7fdba005b254 WatchSource:0}: Error finding container a923739aaf3ddf2562606a2cf5784f9bc353c84e0d81581906ab7fdba005b254: Status 404 returned error can't find the container with id a923739aaf3ddf2562606a2cf5784f9bc353c84e0d81581906ab7fdba005b254
Oct 13 18:28:16 crc kubenswrapper[4974]: E1013 18:28:16.681950 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g6c6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-59578bc799-zkfrg_openstack-operators(bcad591b-b126-4da8-a21c-636d710329b8): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.749792 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w\" (UID: \"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w"
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.756615 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95330d6-c9ed-4fe6-8daa-6ef9495e72ae-cert\") pod \"openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w\" (UID: \"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w"
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.795196 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" event={"ID":"bcad591b-b126-4da8-a21c-636d710329b8","Type":"ContainerStarted","Data":"be687103f5a85d4a98a43dcfb8af4f34136c95616fb5b9cb5f7a39c13250b5fc"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.796581 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" event={"ID":"78805c21-d9b5-4f77-a318-fa1dfa26ebc3","Type":"ContainerStarted","Data":"d8fe58a9853e33734452ff6b362bd05da2354434340c869972182952d5e45ce3"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.798646 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" event={"ID":"b44da60c-a4d1-406d-abb8-db29314b9e50","Type":"ContainerStarted","Data":"605f5041f57d6652a32f7ebfcdd394057b8055ffbf2b4d6a71b3a25d7a4a1dd7"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.799563 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" event={"ID":"f332d432-86f0-4c0b-80d6-dba6e2920a81","Type":"ContainerStarted","Data":"ff3d673a59ce0dd7d82c5b0784ce4b0827107b848f2746c0329d090374dc1496"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.800683 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r" event={"ID":"86f89f48-3e17-4ed9-9cbb-6458223a1864","Type":"ContainerStarted","Data":"d1611708ffbed36dbb2a09b038e603dfeecaff418a9610ab9bd336c013029dc2"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.803932 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" event={"ID":"758864e5-2a90-496e-b006-dcfaf42c20bb","Type":"ContainerStarted","Data":"080b4e6d19981f32e47d8d31c20a95a50a7dba6c5926f6c6c68b3e690557822f"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.805363 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp" event={"ID":"197d51a8-e30e-485c-8e76-bd4ee120da7b","Type":"ContainerStarted","Data":"3da61643da0bfe12c966fa6012378008d43b44abdb567f883b04bd7f0ca32e75"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.806660 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf" event={"ID":"4cbd873e-490d-4f1c-91cc-4ca45f109d7f","Type":"ContainerStarted","Data":"4c8fcce68e1c728fad75033ce10c38e9d4e3bac4441bd8b2083947a428919054"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.807827 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm" event={"ID":"c10ae245-c899-4ea9-9edb-d62b176d19cc","Type":"ContainerStarted","Data":"51a4c1cdd7c1e6b2608d24fb2e32d4571430617d21fcb4b9c3f4ec2b7f100e00"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.808839 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" event={"ID":"e152664c-85e7-4854-8960-ee413a7eb3a3","Type":"ContainerStarted","Data":"cb1dd5ada90e6685f3232268bbbce17eed4cbb90f775aa574e59adc464563dbd"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.810197 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" event={"ID":"ef8af802-f6f6-4018-9bfd-f8aee92ff838","Type":"ContainerStarted","Data":"43e77a09271b67f1b5a0275932aa715646ecd800ec925d49f3df09afa3f1324a"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.811169 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2" event={"ID":"f9ed9202-2a09-42d2-b140-8300e108e36a","Type":"ContainerStarted","Data":"a923739aaf3ddf2562606a2cf5784f9bc353c84e0d81581906ab7fdba005b254"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.812140 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" event={"ID":"e5d3e6f8-15bf-4544-b701-da591158af75","Type":"ContainerStarted","Data":"9bedbe45c9df2e44022f45435fcd6f3b92cd0b3fc1ddf66ed4839ce117b32143"}
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.813797 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k" event={"ID":"50ce5538-ff95-4983-8ff7-3a406b974617","Type":"ContainerStarted","Data":"074653e4ae193f298aeb39c10995d43177a14273731fc8b57f6f3f1061a103c3"}
Oct 13 18:28:16 crc kubenswrapper[4974]: E1013 18:28:16.872285 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" podUID="bcad591b-b126-4da8-a21c-636d710329b8"
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.880291 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w"
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.943290 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.951975 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92f85149-41d6-471d-8d77-25fdafb20ca2-cert\") pod \"openstack-operator-controller-manager-7995b9c57f-x4jst\" (UID: \"92f85149-41d6-471d-8d77-25fdafb20ca2\") " pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.958520 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.971879 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.976056 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/92f85149-41d6-471d-8d77-25fdafb20ca2-cert\") pod \"openstack-operator-controller-manager-7995b9c57f-x4jst\" (UID: \"92f85149-41d6-471d-8d77-25fdafb20ca2\") " pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.977924 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.987087 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp"]
Oct 13 18:28:16 crc kubenswrapper[4974]: I1013 18:28:16.992845 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd"]
Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.001027 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8"]
Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.023455 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2t5fr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f4d5dfdc6-mj7kp_openstack-operators(45bbc336-9feb-40e0-b7a9-92fad85e7396): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.043361 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pvq9q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-ffcdd6c94-lb8ln_openstack-operators(e6e02a94-3239-4e8b-8d87-4adb4ebcc98b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.048021 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-km6j8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
telemetry-operator-controller-manager-578874c84d-2q9c8_openstack-operators(7269886e-6ad1-43fe-a8f2-c535dffe836c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.048201 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.119:5001/openstack-k8s-operators/watcher-operator:adafee99798f561d2ca75683d23efd881833abf2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fp7dt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6f64d8b78-d6wjd_openstack-operators(7bb571d8-3894-46f5-a627-932b5dfdc2fd): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 18:28:17 crc kubenswrapper[4974]: W1013 18:28:17.093664 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b43f3c2_b280_40e9_9467_181a372011e1.slice/crio-e8cdcd413b63257e3ca5a7d21b05bcb131e6b9935b1c87212fd6420cc1f138ed WatchSource:0}: Error finding container e8cdcd413b63257e3ca5a7d21b05bcb131e6b9935b1c87212fd6420cc1f138ed: Status 404 returned error can't find the container with id e8cdcd413b63257e3ca5a7d21b05bcb131e6b9935b1c87212fd6420cc1f138ed Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.096808 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:79b43a69884631c635d2164b95a2d4ec68f5cb33f96da14764f1c710880f3997,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tflgg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
keystone-operator-controller-manager-ddb98f99b-hvwzb_openstack-operators(2b43f3c2-b280-40e9-9467-181a372011e1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.192103 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst" Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.264268 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd" podUID="7bb571d8-3894-46f5-a627-932b5dfdc2fd" Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.302023 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln" podUID="e6e02a94-3239-4e8b-8d87-4adb4ebcc98b" Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.318137 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" podUID="2b43f3c2-b280-40e9-9467-181a372011e1" Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.318245 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" podUID="45bbc336-9feb-40e0-b7a9-92fad85e7396" Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.425034 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" 
pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" podUID="7269886e-6ad1-43fe-a8f2-c535dffe836c" Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.616076 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w"] Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.828630 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" event={"ID":"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae","Type":"ContainerStarted","Data":"5304a0b35f483f8cb3796ecd32ce0600485ac6115d2c95f2338300b405c8a656"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.830365 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9" event={"ID":"9a259044-9901-4a97-89f7-965118976af7","Type":"ContainerStarted","Data":"09055eab4ac35f79364b9a9dc138c64cf25f9ba1db2ad1d9115aeb0f76f48f95"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.837169 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" event={"ID":"45bbc336-9feb-40e0-b7a9-92fad85e7396","Type":"ContainerStarted","Data":"615ae5fd39bba2f3b4b99758f5d31009aa053e20d67a3ada677eb600c147361d"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.837212 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" event={"ID":"45bbc336-9feb-40e0-b7a9-92fad85e7396","Type":"ContainerStarted","Data":"4797ed32518fd4fc0db800d90fab3b776aa06a124abc4af16cd1bd4e75d84ac4"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.840870 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh" 
event={"ID":"923ead90-d60a-431b-9630-693bdc007237","Type":"ContainerStarted","Data":"40341b1781f35d4a9c814212cb25ef971b1c0a2598e6d917252233159be21613"} Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.843471 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" podUID="45bbc336-9feb-40e0-b7a9-92fad85e7396" Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.853520 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" event={"ID":"7269886e-6ad1-43fe-a8f2-c535dffe836c","Type":"ContainerStarted","Data":"0df3931b2be81351be49e7b0a71169cad68c1e4292da7c33f40ec94ac6979000"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.853563 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" event={"ID":"7269886e-6ad1-43fe-a8f2-c535dffe836c","Type":"ContainerStarted","Data":"ec93b8170492525ea86f8be3bcb5a4d8cd94d469ba60ad0e4078d1df4e68f4f4"} Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.859118 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" podUID="7269886e-6ad1-43fe-a8f2-c535dffe836c" Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.872976 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" 
event={"ID":"2b43f3c2-b280-40e9-9467-181a372011e1","Type":"ContainerStarted","Data":"5ca917c2e3eddee863e0950d34bdc4b6fe3301186806bc0e81c28435ed46a629"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.873355 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" event={"ID":"2b43f3c2-b280-40e9-9467-181a372011e1","Type":"ContainerStarted","Data":"e8cdcd413b63257e3ca5a7d21b05bcb131e6b9935b1c87212fd6420cc1f138ed"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.874548 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" event={"ID":"bcad591b-b126-4da8-a21c-636d710329b8","Type":"ContainerStarted","Data":"67dc433a15843727fba18ed1534b96e58dba476225f08c0aa0e235185232867f"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.877552 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd" event={"ID":"7bb571d8-3894-46f5-a627-932b5dfdc2fd","Type":"ContainerStarted","Data":"740de5d2a900233fba6ff709ee005473caae55b02055a60e16d1e986e0b18346"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.877593 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd" event={"ID":"7bb571d8-3894-46f5-a627-932b5dfdc2fd","Type":"ContainerStarted","Data":"18eadc141de2132eefd250a87bee21cadb4afd486bf0661b8f04d2bba004bef0"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.880748 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln" event={"ID":"e6e02a94-3239-4e8b-8d87-4adb4ebcc98b","Type":"ContainerStarted","Data":"7c4774ed8115438ceb702e46835989ba0cd95bdee737a531a4926918f3ac9fa4"} Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.880769 4974 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln" event={"ID":"e6e02a94-3239-4e8b-8d87-4adb4ebcc98b","Type":"ContainerStarted","Data":"ac8ba4f16ac9d07f20da8a3f533d20c0a28bfa3a3fcf22e99062cdc9670ab2d0"} Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.898956 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln" podUID="e6e02a94-3239-4e8b-8d87-4adb4ebcc98b" Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.899219 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:79b43a69884631c635d2164b95a2d4ec68f5cb33f96da14764f1c710880f3997\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" podUID="2b43f3c2-b280-40e9-9467-181a372011e1" Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.899277 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" podUID="bcad591b-b126-4da8-a21c-636d710329b8" Oct 13 18:28:17 crc kubenswrapper[4974]: E1013 18:28:17.899317 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.119:5001/openstack-k8s-operators/watcher-operator:adafee99798f561d2ca75683d23efd881833abf2\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd" podUID="7bb571d8-3894-46f5-a627-932b5dfdc2fd" Oct 13 18:28:17 crc kubenswrapper[4974]: I1013 18:28:17.968632 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst"] Oct 13 18:28:18 crc kubenswrapper[4974]: I1013 18:28:18.910720 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst" event={"ID":"92f85149-41d6-471d-8d77-25fdafb20ca2","Type":"ContainerStarted","Data":"af167d10f7a2deeb7d0242d20a89a1962431aa44cc3fbf23b920abdad1cbec7e"} Oct 13 18:28:18 crc kubenswrapper[4974]: I1013 18:28:18.911056 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst" event={"ID":"92f85149-41d6-471d-8d77-25fdafb20ca2","Type":"ContainerStarted","Data":"6ee7dfca94adb43baf3b00335611641facccf13301427bc8666bda35cafa55d3"} Oct 13 18:28:18 crc kubenswrapper[4974]: I1013 18:28:18.911067 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst" event={"ID":"92f85149-41d6-471d-8d77-25fdafb20ca2","Type":"ContainerStarted","Data":"33c6796f9dbc3940a4fd3ee7e75bf7cc965e165b80acf61551815c5fb75ec350"} Oct 13 18:28:18 crc kubenswrapper[4974]: I1013 18:28:18.911402 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst" Oct 13 18:28:18 crc kubenswrapper[4974]: E1013 18:28:18.925244 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:7e584b1c430441c8b6591dadeff32e065de8a185ad37ef90d2e08d37e59aab4a\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln" podUID="e6e02a94-3239-4e8b-8d87-4adb4ebcc98b" Oct 13 18:28:18 crc kubenswrapper[4974]: E1013 18:28:18.925267 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:582f7b1e411961b69f2e3c6b346aa25759b89f7720ed3fade1d363bf5d2dffc8\\\"\"" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" podUID="bcad591b-b126-4da8-a21c-636d710329b8" Oct 13 18:28:18 crc kubenswrapper[4974]: E1013 18:28:18.931266 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:abe978f8da75223de5043cca50278ad4e28c8dd309883f502fe1e7a9998733b0\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" podUID="7269886e-6ad1-43fe-a8f2-c535dffe836c" Oct 13 18:28:18 crc kubenswrapper[4974]: E1013 18:28:18.931342 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:79b43a69884631c635d2164b95a2d4ec68f5cb33f96da14764f1c710880f3997\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" podUID="2b43f3c2-b280-40e9-9467-181a372011e1" Oct 13 18:28:18 crc kubenswrapper[4974]: E1013 18:28:18.931364 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:4b4a17fe08ce00e375afaaec6a28835f5c1784f03d11c4558376ac04130f3a9e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" 
podUID="45bbc336-9feb-40e0-b7a9-92fad85e7396" Oct 13 18:28:18 crc kubenswrapper[4974]: E1013 18:28:18.940590 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.119:5001/openstack-k8s-operators/watcher-operator:adafee99798f561d2ca75683d23efd881833abf2\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd" podUID="7bb571d8-3894-46f5-a627-932b5dfdc2fd" Oct 13 18:28:19 crc kubenswrapper[4974]: I1013 18:28:19.040990 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst" podStartSLOduration=4.040975272 podStartE2EDuration="4.040975272s" podCreationTimestamp="2025-10-13 18:28:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:28:19.016195079 +0000 UTC m=+833.920561159" watchObservedRunningTime="2025-10-13 18:28:19.040975272 +0000 UTC m=+833.945341352" Oct 13 18:28:27 crc kubenswrapper[4974]: I1013 18:28:27.199320 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7995b9c57f-x4jst" Oct 13 18:28:29 crc kubenswrapper[4974]: E1013 18:28:29.091994 4974 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0" Oct 13 18:28:29 crc kubenswrapper[4974]: E1013 18:28:29.092436 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0,Command:[/manager],Args:[--health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-468zz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
glance-operator-controller-manager-7bb46cd7d-wzvwc_openstack-operators(78805c21-d9b5-4f77-a318-fa1dfa26ebc3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 18:28:29 crc kubenswrapper[4974]: E1013 18:28:29.610862 4974 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:73736f216f886549901fbcfc823b072f73691c9a79ec79e59d100e992b9c1e34" Oct 13 18:28:29 crc kubenswrapper[4974]: E1013 18:28:29.611064 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:73736f216f886549901fbcfc823b072f73691c9a79ec79e59d100e992b9c1e34,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rv5wl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-687df44cdb-h2cmd_openstack-operators(758864e5-2a90-496e-b006-dcfaf42c20bb): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 18:28:29 crc kubenswrapper[4974]: I1013 18:28:29.821822 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 18:28:30 crc kubenswrapper[4974]: E1013 18:28:30.116427 4974 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9" Oct 13 18:28:30 crc kubenswrapper[4974]: E1013 18:28:30.117576 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fnf5g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-59cdc64769-t2hfb_openstack-operators(b44da60c-a4d1-406d-abb8-db29314b9e50): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 18:28:30 crc kubenswrapper[4974]: E1013 18:28:30.496329 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" podUID="758864e5-2a90-496e-b006-dcfaf42c20bb" Oct 13 18:28:30 crc kubenswrapper[4974]: E1013 18:28:30.497220 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" podUID="78805c21-d9b5-4f77-a318-fa1dfa26ebc3" Oct 13 18:28:30 crc kubenswrapper[4974]: E1013 18:28:30.547730 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" podUID="b44da60c-a4d1-406d-abb8-db29314b9e50" Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.145065 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2" event={"ID":"f9ed9202-2a09-42d2-b140-8300e108e36a","Type":"ContainerStarted","Data":"2f66c2787554b8f238af30f6c3d4f871c1923b7bcaabb6aabc2f94cec8b00530"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.173861 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh" event={"ID":"923ead90-d60a-431b-9630-693bdc007237","Type":"ContainerStarted","Data":"306ca700f496817663ca331fb5bc5720be7690c32fbd6c5bc660f6171c7ee40b"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.173908 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh" event={"ID":"923ead90-d60a-431b-9630-693bdc007237","Type":"ContainerStarted","Data":"4b7ff1b5d0ff520f9832293a8c1cd75a4b16f783313c88acda32a9738db69dfd"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.174331 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh" Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.196826 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp" event={"ID":"197d51a8-e30e-485c-8e76-bd4ee120da7b","Type":"ContainerStarted","Data":"5cc8fdcf7c65117fb124eeadd5357d0da7c7daa2cc53f894f706c8e7f63598ab"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.221681 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" 
event={"ID":"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae","Type":"ContainerStarted","Data":"c9725457eb5caf1bfe16dd0eef3084e0eec26581d9a929bf0cb36fe762f2c356"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.249012 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" event={"ID":"78805c21-d9b5-4f77-a318-fa1dfa26ebc3","Type":"ContainerStarted","Data":"cc12adb7a1da33de5e4241ff0b25f98bdbc555d58d4678c96bd7655b20e37565"} Oct 13 18:28:31 crc kubenswrapper[4974]: E1013 18:28:31.250483 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" podUID="78805c21-d9b5-4f77-a318-fa1dfa26ebc3" Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.266853 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" event={"ID":"b44da60c-a4d1-406d-abb8-db29314b9e50","Type":"ContainerStarted","Data":"4eb403af337e24615149034bcf802d20b0a2057f8a855b9a5499586abc58ce62"} Oct 13 18:28:31 crc kubenswrapper[4974]: E1013 18:28:31.279847 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" podUID="b44da60c-a4d1-406d-abb8-db29314b9e50" Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.293063 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9" 
event={"ID":"9a259044-9901-4a97-89f7-965118976af7","Type":"ContainerStarted","Data":"c0443e1df972566169a52e8f53866bdcda3714e3c658cf381b56f0f07cb04130"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.293589 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh" podStartSLOduration=4.133693392 podStartE2EDuration="17.2935723s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:17.022955963 +0000 UTC m=+831.927322053" lastFinishedPulling="2025-10-13 18:28:30.182834871 +0000 UTC m=+845.087200961" observedRunningTime="2025-10-13 18:28:31.269835048 +0000 UTC m=+846.174201128" watchObservedRunningTime="2025-10-13 18:28:31.2935723 +0000 UTC m=+846.197938380" Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.321310 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" event={"ID":"e5d3e6f8-15bf-4544-b701-da591158af75","Type":"ContainerStarted","Data":"8842be63227a9e9b9016be4aa0fcd4573fb35e2509672941ca7878b97d07c35e"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.372011 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k" event={"ID":"50ce5538-ff95-4983-8ff7-3a406b974617","Type":"ContainerStarted","Data":"72605360f310e952aaf0d41c73cc45f5e0364856432bea5f1008d02ca3d88c21"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.410262 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" event={"ID":"f332d432-86f0-4c0b-80d6-dba6e2920a81","Type":"ContainerStarted","Data":"8ee88220c19ab41467022a4b38ff33f8267a20a8bc6597d8041fe7fe4d4d4202"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.452459 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r" event={"ID":"86f89f48-3e17-4ed9-9cbb-6458223a1864","Type":"ContainerStarted","Data":"4bca54eb3c88b4dbfa4137d8b3bafa4f2367a5c9232d6bfa3d515085336fa878"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.453164 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r" Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.475673 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-5f97d8c699-scqj9" podStartSLOduration=3.289075018 podStartE2EDuration="16.475645084s" podCreationTimestamp="2025-10-13 18:28:15 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.995154864 +0000 UTC m=+831.899520944" lastFinishedPulling="2025-10-13 18:28:30.18172493 +0000 UTC m=+845.086091010" observedRunningTime="2025-10-13 18:28:31.458678453 +0000 UTC m=+846.363044533" watchObservedRunningTime="2025-10-13 18:28:31.475645084 +0000 UTC m=+846.380011164" Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.484363 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" event={"ID":"758864e5-2a90-496e-b006-dcfaf42c20bb","Type":"ContainerStarted","Data":"28050b4ca6b0ea2981f89acf4a9001e27af7755e243c181061a78195e755f490"} Oct 13 18:28:31 crc kubenswrapper[4974]: E1013 18:28:31.491350 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:73736f216f886549901fbcfc823b072f73691c9a79ec79e59d100e992b9c1e34\\\"\"" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" podUID="758864e5-2a90-496e-b006-dcfaf42c20bb" Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.494947 4974 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm" event={"ID":"c10ae245-c899-4ea9-9edb-d62b176d19cc","Type":"ContainerStarted","Data":"456beb69351323a19f4768bcdce3846de20e987845c9396826844f62415aa2bc"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.495915 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" event={"ID":"e152664c-85e7-4854-8960-ee413a7eb3a3","Type":"ContainerStarted","Data":"c74df0425b79bc0e4d47d15938a2f99c0b543b4e2e5580094d836d4895894f89"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.496766 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" event={"ID":"ef8af802-f6f6-4018-9bfd-f8aee92ff838","Type":"ContainerStarted","Data":"1501edb43b6736bb4cad618040d1e268e2309063d610fc14564d43c8d0357414"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.497671 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf" event={"ID":"4cbd873e-490d-4f1c-91cc-4ca45f109d7f","Type":"ContainerStarted","Data":"219f572919b89c8195e278f636dfb13fdbfb6b5a36f37416ad32244491647300"} Oct 13 18:28:31 crc kubenswrapper[4974]: I1013 18:28:31.598805 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r" podStartSLOduration=3.7501206270000003 podStartE2EDuration="17.598776249s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.328813257 +0000 UTC m=+831.233179337" lastFinishedPulling="2025-10-13 18:28:30.177468839 +0000 UTC m=+845.081834959" observedRunningTime="2025-10-13 18:28:31.522724976 +0000 UTC m=+846.427091056" watchObservedRunningTime="2025-10-13 18:28:31.598776249 +0000 UTC m=+846.503142329" Oct 13 
18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.521585 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp" event={"ID":"197d51a8-e30e-485c-8e76-bd4ee120da7b","Type":"ContainerStarted","Data":"7d3f1b7116bb21a5a2fb0b3ba731769b612687abd12a497a91dc95fe8975c96e"} Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.522047 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.524191 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" event={"ID":"d95330d6-c9ed-4fe6-8daa-6ef9495e72ae","Type":"ContainerStarted","Data":"0c2389d08fda147c8a86ba7ac79a2638263435c6674b6a9fdf56136fe9295257"} Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.525021 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.530170 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k" event={"ID":"50ce5538-ff95-4983-8ff7-3a406b974617","Type":"ContainerStarted","Data":"6a9104edde5bcf4db3ad82148da0f6f7acb5be5fbc8fc624f75c589e74c9e271"} Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.531420 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.533488 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2" 
event={"ID":"f9ed9202-2a09-42d2-b140-8300e108e36a","Type":"ContainerStarted","Data":"9de47a7b65f5018013e6e62bf7e7b746ae9f457d1d2c5b67cdad6b4be08438a0"} Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.536829 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp" podStartSLOduration=4.874950241 podStartE2EDuration="18.53681068s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.519924082 +0000 UTC m=+831.424290162" lastFinishedPulling="2025-10-13 18:28:30.181784511 +0000 UTC m=+845.086150601" observedRunningTime="2025-10-13 18:28:32.535965856 +0000 UTC m=+847.440331936" watchObservedRunningTime="2025-10-13 18:28:32.53681068 +0000 UTC m=+847.441176760" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.537846 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.539693 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf" event={"ID":"4cbd873e-490d-4f1c-91cc-4ca45f109d7f","Type":"ContainerStarted","Data":"e05dca56e103ab11c3889bc96b789c6d591c1e75f2d3694aa3bc20d760c661ae"} Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.539925 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.550245 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r" event={"ID":"86f89f48-3e17-4ed9-9cbb-6458223a1864","Type":"ContainerStarted","Data":"830a79c8c1af8d1c93c89016a69bb6c70ee0f5bb32ebbf579dde8147233f3405"} Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.555389 4974 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm" event={"ID":"c10ae245-c899-4ea9-9edb-d62b176d19cc","Type":"ContainerStarted","Data":"a2e5ed43da87a39e2cc49aea3604a960f26a7058ea52999755199218b2a8ff63"} Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.555495 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.565534 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" podStartSLOduration=6.081792649 podStartE2EDuration="18.565518532s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:17.698074839 +0000 UTC m=+832.602440919" lastFinishedPulling="2025-10-13 18:28:30.181800712 +0000 UTC m=+845.086166802" observedRunningTime="2025-10-13 18:28:32.563404313 +0000 UTC m=+847.467770393" watchObservedRunningTime="2025-10-13 18:28:32.565518532 +0000 UTC m=+847.469884612" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.569237 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" event={"ID":"e5d3e6f8-15bf-4544-b701-da591158af75","Type":"ContainerStarted","Data":"a25f4f6004ee80cccde6eeea62ed6efe50d4ea5bb6ac4bdf74536e16c8287971"} Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.569865 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.576383 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" 
event={"ID":"e152664c-85e7-4854-8960-ee413a7eb3a3","Type":"ContainerStarted","Data":"ca259be346e5496678e1f3ddb205f463d05f2580e74d1ee32ac82ea6db35eb04"} Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.576541 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.584795 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" event={"ID":"f332d432-86f0-4c0b-80d6-dba6e2920a81","Type":"ContainerStarted","Data":"fe8accd11544bb2baf457bd9e44f9c7ad7a0494a71b21f018a92df4a8a696158"} Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.585444 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.586976 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2" podStartSLOduration=5.07729133 podStartE2EDuration="18.586964659s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.672120113 +0000 UTC m=+831.576486203" lastFinishedPulling="2025-10-13 18:28:30.181793432 +0000 UTC m=+845.086159532" observedRunningTime="2025-10-13 18:28:32.582683558 +0000 UTC m=+847.487049638" watchObservedRunningTime="2025-10-13 18:28:32.586964659 +0000 UTC m=+847.491330739" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.596357 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" event={"ID":"ef8af802-f6f6-4018-9bfd-f8aee92ff838","Type":"ContainerStarted","Data":"3878cd2db1c80ed771776d0aab68f055a0d85805616513b67edebffcca97b232"} Oct 13 18:28:32 crc kubenswrapper[4974]: E1013 18:28:32.605742 4974 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:c487a793648e64af2d64df5f6efbda2d4fd586acd7aee6838d3ec2b3edd9efb9\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" podUID="b44da60c-a4d1-406d-abb8-db29314b9e50" Oct 13 18:28:32 crc kubenswrapper[4974]: E1013 18:28:32.605750 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:73736f216f886549901fbcfc823b072f73691c9a79ec79e59d100e992b9c1e34\\\"\"" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" podUID="758864e5-2a90-496e-b006-dcfaf42c20bb" Oct 13 18:28:32 crc kubenswrapper[4974]: E1013 18:28:32.605849 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:3cc6bba71197ddf88dd4ba1301542bacbc1fe12e6faab2b69e6960944b3d74a0\\\"\"" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" podUID="78805c21-d9b5-4f77-a318-fa1dfa26ebc3" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.633442 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k" podStartSLOduration=5.02024643 podStartE2EDuration="18.633422764s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.609951398 +0000 UTC m=+831.514317478" lastFinishedPulling="2025-10-13 18:28:30.223127722 +0000 UTC m=+845.127493812" observedRunningTime="2025-10-13 18:28:32.596009726 +0000 UTC m=+847.500375806" watchObservedRunningTime="2025-10-13 18:28:32.633422764 +0000 UTC 
m=+847.537788844" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.642316 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" podStartSLOduration=4.613229604 podStartE2EDuration="18.642299176s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.177842581 +0000 UTC m=+831.082208661" lastFinishedPulling="2025-10-13 18:28:30.206912143 +0000 UTC m=+845.111278233" observedRunningTime="2025-10-13 18:28:32.631319115 +0000 UTC m=+847.535685185" watchObservedRunningTime="2025-10-13 18:28:32.642299176 +0000 UTC m=+847.546665256" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.666082 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" podStartSLOduration=4.674674194 podStartE2EDuration="18.666057848s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.190443389 +0000 UTC m=+831.094809459" lastFinishedPulling="2025-10-13 18:28:30.181827013 +0000 UTC m=+845.086193113" observedRunningTime="2025-10-13 18:28:32.649597162 +0000 UTC m=+847.553963242" watchObservedRunningTime="2025-10-13 18:28:32.666057848 +0000 UTC m=+847.570423928" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.676572 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" podStartSLOduration=5.102433407 podStartE2EDuration="18.676551275s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.607608062 +0000 UTC m=+831.511974142" lastFinishedPulling="2025-10-13 18:28:30.18172592 +0000 UTC m=+845.086092010" observedRunningTime="2025-10-13 18:28:32.672461459 +0000 UTC m=+847.576827539" watchObservedRunningTime="2025-10-13 18:28:32.676551275 +0000 UTC m=+847.580917345" Oct 13 
18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.694419 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm" podStartSLOduration=5.163998473 podStartE2EDuration="18.69439769s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.651366984 +0000 UTC m=+831.555733074" lastFinishedPulling="2025-10-13 18:28:30.181766171 +0000 UTC m=+845.086132291" observedRunningTime="2025-10-13 18:28:32.684301115 +0000 UTC m=+847.588667195" watchObservedRunningTime="2025-10-13 18:28:32.69439769 +0000 UTC m=+847.598763770" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.702881 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf" podStartSLOduration=5.131744137 podStartE2EDuration="18.70285874s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.60648229 +0000 UTC m=+831.510848370" lastFinishedPulling="2025-10-13 18:28:30.177596883 +0000 UTC m=+845.081962973" observedRunningTime="2025-10-13 18:28:32.699389232 +0000 UTC m=+847.603755312" watchObservedRunningTime="2025-10-13 18:28:32.70285874 +0000 UTC m=+847.607224820" Oct 13 18:28:32 crc kubenswrapper[4974]: I1013 18:28:32.751950 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" podStartSLOduration=4.477435442 podStartE2EDuration="18.751925339s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:15.947958955 +0000 UTC m=+830.852325035" lastFinishedPulling="2025-10-13 18:28:30.222448842 +0000 UTC m=+845.126814932" observedRunningTime="2025-10-13 18:28:32.749252773 +0000 UTC m=+847.653618853" watchObservedRunningTime="2025-10-13 18:28:32.751925339 +0000 UTC m=+847.656291419" Oct 13 18:28:33 crc 
kubenswrapper[4974]: I1013 18:28:33.609395 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" Oct 13 18:28:35 crc kubenswrapper[4974]: I1013 18:28:35.012969 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-74cb5cbc49-w4lrj" Oct 13 18:28:35 crc kubenswrapper[4974]: I1013 18:28:35.113385 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-585fc5b659-rzm52" Oct 13 18:28:35 crc kubenswrapper[4974]: I1013 18:28:35.221187 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-6d9967f8dd-n4n2k" Oct 13 18:28:35 crc kubenswrapper[4974]: I1013 18:28:35.257292 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5777b4f897-gq4sm" Oct 13 18:28:35 crc kubenswrapper[4974]: I1013 18:28:35.275255 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-797d478b46-7p29r" Oct 13 18:28:35 crc kubenswrapper[4974]: I1013 18:28:35.279398 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-664664cb68-rs4rf" Oct 13 18:28:35 crc kubenswrapper[4974]: I1013 18:28:35.295730 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-57bb74c7bf-2xgsp" Oct 13 18:28:35 crc kubenswrapper[4974]: I1013 18:28:35.324570 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-6d7c7ddf95-zx7qh" Oct 13 18:28:35 crc kubenswrapper[4974]: I1013 18:28:35.325184 4974 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-869cc7797f-8ftd2" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.631257 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" event={"ID":"2b43f3c2-b280-40e9-9467-181a372011e1","Type":"ContainerStarted","Data":"b825fa7de45eba3d86a840cc01cce9be76f4add0a9b1b5c464f0595f9563a921"} Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.631809 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.635924 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" event={"ID":"bcad591b-b126-4da8-a21c-636d710329b8","Type":"ContainerStarted","Data":"d5128236f3b3c5776b2638f6a4e92d7d3cfd0555735e727073764e88f3b98531"} Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.636150 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.638108 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd" event={"ID":"7bb571d8-3894-46f5-a627-932b5dfdc2fd","Type":"ContainerStarted","Data":"42b86e881648cefbbb579934d9a258e34b3963ff3fbe5e931253b6a814caff31"} Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.638366 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.641339 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln" 
event={"ID":"e6e02a94-3239-4e8b-8d87-4adb4ebcc98b","Type":"ContainerStarted","Data":"48e62446f8db39f0b3d450df927c363c6e1b9f60b972ed14756f123f1177379a"} Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.642060 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.643853 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" event={"ID":"45bbc336-9feb-40e0-b7a9-92fad85e7396","Type":"ContainerStarted","Data":"9159869edc179fadcee8472759437c3213ac6e2098be761c164c86ba882ac002"} Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.644036 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.645759 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" event={"ID":"7269886e-6ad1-43fe-a8f2-c535dffe836c","Type":"ContainerStarted","Data":"b7f6610aa5a60e13396d763f2be7aaa99e8fdfb652de3ec7e432c7c0728ac4c3"} Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.646532 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.651386 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" podStartSLOduration=3.733544412 podStartE2EDuration="22.65136667s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:17.096695557 +0000 UTC m=+832.001061637" lastFinishedPulling="2025-10-13 18:28:36.014517805 +0000 UTC m=+850.918883895" 
observedRunningTime="2025-10-13 18:28:36.649082515 +0000 UTC m=+851.553448605" watchObservedRunningTime="2025-10-13 18:28:36.65136667 +0000 UTC m=+851.555732750" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.683545 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" podStartSLOduration=3.350999559 podStartE2EDuration="22.68352648s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.681797498 +0000 UTC m=+831.586163578" lastFinishedPulling="2025-10-13 18:28:36.014324379 +0000 UTC m=+850.918690499" observedRunningTime="2025-10-13 18:28:36.676737628 +0000 UTC m=+851.581103708" watchObservedRunningTime="2025-10-13 18:28:36.68352648 +0000 UTC m=+851.587892560" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.693520 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln" podStartSLOduration=3.73827959 podStartE2EDuration="22.693502872s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:17.043185177 +0000 UTC m=+831.947551267" lastFinishedPulling="2025-10-13 18:28:35.998408469 +0000 UTC m=+850.902774549" observedRunningTime="2025-10-13 18:28:36.690034364 +0000 UTC m=+851.594400454" watchObservedRunningTime="2025-10-13 18:28:36.693502872 +0000 UTC m=+851.597868952" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.708516 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd" podStartSLOduration=2.717398731 podStartE2EDuration="21.708499067s" podCreationTimestamp="2025-10-13 18:28:15 +0000 UTC" firstStartedPulling="2025-10-13 18:28:17.048122848 +0000 UTC m=+831.952488928" lastFinishedPulling="2025-10-13 18:28:36.039223174 +0000 UTC m=+850.943589264" observedRunningTime="2025-10-13 
18:28:36.707258942 +0000 UTC m=+851.611625032" watchObservedRunningTime="2025-10-13 18:28:36.708499067 +0000 UTC m=+851.612865147" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.730534 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" podStartSLOduration=3.764338099 podStartE2EDuration="22.73051061s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:17.047875011 +0000 UTC m=+831.952241101" lastFinishedPulling="2025-10-13 18:28:36.014047532 +0000 UTC m=+850.918413612" observedRunningTime="2025-10-13 18:28:36.726741803 +0000 UTC m=+851.631107893" watchObservedRunningTime="2025-10-13 18:28:36.73051061 +0000 UTC m=+851.634876690" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.744920 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" podStartSLOduration=3.743744674 podStartE2EDuration="22.744901457s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:17.022969354 +0000 UTC m=+831.927335434" lastFinishedPulling="2025-10-13 18:28:36.024126137 +0000 UTC m=+850.928492217" observedRunningTime="2025-10-13 18:28:36.741851881 +0000 UTC m=+851.646217961" watchObservedRunningTime="2025-10-13 18:28:36.744901457 +0000 UTC m=+851.649267537" Oct 13 18:28:36 crc kubenswrapper[4974]: I1013 18:28:36.886700 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w" Oct 13 18:28:37 crc kubenswrapper[4974]: I1013 18:28:37.743947 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 13 18:28:37 crc kubenswrapper[4974]: I1013 18:28:37.744373 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:28:37 crc kubenswrapper[4974]: I1013 18:28:37.744441 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:28:37 crc kubenswrapper[4974]: I1013 18:28:37.745350 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"436597ac77fb62acd2a6755e030d52c331e47424c1685e9a911b5a1473f796ca"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 18:28:37 crc kubenswrapper[4974]: I1013 18:28:37.745436 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://436597ac77fb62acd2a6755e030d52c331e47424c1685e9a911b5a1473f796ca" gracePeriod=600 Oct 13 18:28:38 crc kubenswrapper[4974]: I1013 18:28:38.667813 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="436597ac77fb62acd2a6755e030d52c331e47424c1685e9a911b5a1473f796ca" exitCode=0 Oct 13 18:28:38 crc kubenswrapper[4974]: I1013 18:28:38.667857 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" 
event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"436597ac77fb62acd2a6755e030d52c331e47424c1685e9a911b5a1473f796ca"} Oct 13 18:28:38 crc kubenswrapper[4974]: I1013 18:28:38.668618 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"171b1edc0ad6306edaf67441f6ae19fb0da0e0db23e98eea0abca2248299e8ae"} Oct 13 18:28:38 crc kubenswrapper[4974]: I1013 18:28:38.668713 4974 scope.go:117] "RemoveContainer" containerID="8894b6a4641af63269e4147cdef54f2c3c1eada1368d28e6c504cfd79085b430" Oct 13 18:28:44 crc kubenswrapper[4974]: I1013 18:28:44.844552 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-64f84fcdbb-hvqbv" Oct 13 18:28:44 crc kubenswrapper[4974]: I1013 18:28:44.975281 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d74794d9b-cp96l" Oct 13 18:28:45 crc kubenswrapper[4974]: I1013 18:28:45.325365 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f4d5dfdc6-mj7kp" Oct 13 18:28:45 crc kubenswrapper[4974]: I1013 18:28:45.416751 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-ddb98f99b-hvwzb" Oct 13 18:28:45 crc kubenswrapper[4974]: I1013 18:28:45.449681 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-59578bc799-zkfrg" Oct 13 18:28:45 crc kubenswrapper[4974]: I1013 18:28:45.616281 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-578874c84d-2q9c8" Oct 13 18:28:45 crc 
kubenswrapper[4974]: I1013 18:28:45.639079 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-ffcdd6c94-lb8ln" Oct 13 18:28:45 crc kubenswrapper[4974]: I1013 18:28:45.672739 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6f64d8b78-d6wjd" Oct 13 18:28:48 crc kubenswrapper[4974]: I1013 18:28:48.791511 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" event={"ID":"b44da60c-a4d1-406d-abb8-db29314b9e50","Type":"ContainerStarted","Data":"585c8b98848012a229f7ac656df88eb2f33329a0603bc0ec0568210588aa8dd6"} Oct 13 18:28:48 crc kubenswrapper[4974]: I1013 18:28:48.792464 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" Oct 13 18:28:48 crc kubenswrapper[4974]: I1013 18:28:48.797586 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" event={"ID":"758864e5-2a90-496e-b006-dcfaf42c20bb","Type":"ContainerStarted","Data":"05ca5d66c66bed392a5f1d37138c79087353bedd534a483ba3a8101c8ee78166"} Oct 13 18:28:48 crc kubenswrapper[4974]: I1013 18:28:48.800156 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" Oct 13 18:28:48 crc kubenswrapper[4974]: I1013 18:28:48.808175 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" event={"ID":"78805c21-d9b5-4f77-a318-fa1dfa26ebc3","Type":"ContainerStarted","Data":"62e7480eb6f88b9e705e15b9e297687c9bb7b51cb64aeaea2e1ff9a3856e8e73"} Oct 13 18:28:48 crc kubenswrapper[4974]: I1013 18:28:48.808899 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" Oct 13 18:28:48 crc kubenswrapper[4974]: I1013 18:28:48.827701 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" podStartSLOduration=3.166683142 podStartE2EDuration="34.827672245s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:15.963158916 +0000 UTC m=+830.867524996" lastFinishedPulling="2025-10-13 18:28:47.624148019 +0000 UTC m=+862.528514099" observedRunningTime="2025-10-13 18:28:48.823722303 +0000 UTC m=+863.728088383" watchObservedRunningTime="2025-10-13 18:28:48.827672245 +0000 UTC m=+863.732038375" Oct 13 18:28:48 crc kubenswrapper[4974]: I1013 18:28:48.841470 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" podStartSLOduration=3.496519037 podStartE2EDuration="34.841444965s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.197794237 +0000 UTC m=+831.102160317" lastFinishedPulling="2025-10-13 18:28:47.542720165 +0000 UTC m=+862.447086245" observedRunningTime="2025-10-13 18:28:48.836840615 +0000 UTC m=+863.741206715" watchObservedRunningTime="2025-10-13 18:28:48.841444965 +0000 UTC m=+863.745811045" Oct 13 18:28:48 crc kubenswrapper[4974]: I1013 18:28:48.853568 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" podStartSLOduration=3.669264668 podStartE2EDuration="34.853545377s" podCreationTimestamp="2025-10-13 18:28:14 +0000 UTC" firstStartedPulling="2025-10-13 18:28:16.328414926 +0000 UTC m=+831.232781006" lastFinishedPulling="2025-10-13 18:28:47.512695625 +0000 UTC m=+862.417061715" observedRunningTime="2025-10-13 18:28:48.85188413 +0000 UTC m=+863.756250220" 
watchObservedRunningTime="2025-10-13 18:28:48.853545377 +0000 UTC m=+863.757911457" Oct 13 18:28:54 crc kubenswrapper[4974]: I1013 18:28:54.849628 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-59cdc64769-t2hfb" Oct 13 18:28:54 crc kubenswrapper[4974]: I1013 18:28:54.875249 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-687df44cdb-h2cmd" Oct 13 18:28:54 crc kubenswrapper[4974]: I1013 18:28:54.877094 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7bb46cd7d-wzvwc" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.118109 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8ff9c764f-zjfr2"] Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.120133 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.122588 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8h7k5" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.122834 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.125853 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.125868 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.132718 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8ff9c764f-zjfr2"] Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.193778 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68587c85b9-rcsdn"] Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.194933 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.201730 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.209322 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68587c85b9-rcsdn"] Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.241212 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m244n\" (UniqueName: \"kubernetes.io/projected/348ed400-353d-42f7-86e3-b6360bba1408-kube-api-access-m244n\") pod \"dnsmasq-dns-8ff9c764f-zjfr2\" (UID: \"348ed400-353d-42f7-86e3-b6360bba1408\") " pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.241255 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348ed400-353d-42f7-86e3-b6360bba1408-config\") pod \"dnsmasq-dns-8ff9c764f-zjfr2\" (UID: \"348ed400-353d-42f7-86e3-b6360bba1408\") " pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.343122 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-config\") pod \"dnsmasq-dns-68587c85b9-rcsdn\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.343233 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-dns-svc\") pod \"dnsmasq-dns-68587c85b9-rcsdn\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 
18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.343267 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m244n\" (UniqueName: \"kubernetes.io/projected/348ed400-353d-42f7-86e3-b6360bba1408-kube-api-access-m244n\") pod \"dnsmasq-dns-8ff9c764f-zjfr2\" (UID: \"348ed400-353d-42f7-86e3-b6360bba1408\") " pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.343292 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nghsn\" (UniqueName: \"kubernetes.io/projected/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-kube-api-access-nghsn\") pod \"dnsmasq-dns-68587c85b9-rcsdn\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.343328 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348ed400-353d-42f7-86e3-b6360bba1408-config\") pod \"dnsmasq-dns-8ff9c764f-zjfr2\" (UID: \"348ed400-353d-42f7-86e3-b6360bba1408\") " pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.344250 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348ed400-353d-42f7-86e3-b6360bba1408-config\") pod \"dnsmasq-dns-8ff9c764f-zjfr2\" (UID: \"348ed400-353d-42f7-86e3-b6360bba1408\") " pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.367625 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m244n\" (UniqueName: \"kubernetes.io/projected/348ed400-353d-42f7-86e3-b6360bba1408-kube-api-access-m244n\") pod \"dnsmasq-dns-8ff9c764f-zjfr2\" (UID: \"348ed400-353d-42f7-86e3-b6360bba1408\") " pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 
18:29:14.437407 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.444380 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-config\") pod \"dnsmasq-dns-68587c85b9-rcsdn\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.444847 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-dns-svc\") pod \"dnsmasq-dns-68587c85b9-rcsdn\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.444887 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nghsn\" (UniqueName: \"kubernetes.io/projected/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-kube-api-access-nghsn\") pod \"dnsmasq-dns-68587c85b9-rcsdn\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.445359 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-config\") pod \"dnsmasq-dns-68587c85b9-rcsdn\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.445847 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-dns-svc\") pod \"dnsmasq-dns-68587c85b9-rcsdn\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " 
pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.462522 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nghsn\" (UniqueName: \"kubernetes.io/projected/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-kube-api-access-nghsn\") pod \"dnsmasq-dns-68587c85b9-rcsdn\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.536419 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.912549 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8ff9c764f-zjfr2"] Oct 13 18:29:14 crc kubenswrapper[4974]: I1013 18:29:14.983679 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68587c85b9-rcsdn"] Oct 13 18:29:14 crc kubenswrapper[4974]: W1013 18:29:14.994399 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f46dab1_c765_4cbb_8f0d_41071cf7bc1a.slice/crio-dc484265b9d97af21ea0b2e231adeb8392820808f77d4eca0429375434de572e WatchSource:0}: Error finding container dc484265b9d97af21ea0b2e231adeb8392820808f77d4eca0429375434de572e: Status 404 returned error can't find the container with id dc484265b9d97af21ea0b2e231adeb8392820808f77d4eca0429375434de572e Oct 13 18:29:15 crc kubenswrapper[4974]: I1013 18:29:15.060039 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" event={"ID":"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a","Type":"ContainerStarted","Data":"dc484265b9d97af21ea0b2e231adeb8392820808f77d4eca0429375434de572e"} Oct 13 18:29:15 crc kubenswrapper[4974]: I1013 18:29:15.065283 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" 
event={"ID":"348ed400-353d-42f7-86e3-b6360bba1408","Type":"ContainerStarted","Data":"95c846d8c5c9be1d6a3975e755b406c8410847bca3b4be0f9faa7c6f1466b2a6"} Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.266593 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68587c85b9-rcsdn"] Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.295528 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5788458d7f-4dccc"] Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.296677 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.306518 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5788458d7f-4dccc"] Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.398524 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-dns-svc\") pod \"dnsmasq-dns-5788458d7f-4dccc\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.398732 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmbmt\" (UniqueName: \"kubernetes.io/projected/587562fd-5751-4c44-970b-bccd97cf59b2-kube-api-access-xmbmt\") pod \"dnsmasq-dns-5788458d7f-4dccc\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.398814 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-config\") pod \"dnsmasq-dns-5788458d7f-4dccc\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " 
pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.499852 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-config\") pod \"dnsmasq-dns-5788458d7f-4dccc\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.499909 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-dns-svc\") pod \"dnsmasq-dns-5788458d7f-4dccc\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.499971 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmbmt\" (UniqueName: \"kubernetes.io/projected/587562fd-5751-4c44-970b-bccd97cf59b2-kube-api-access-xmbmt\") pod \"dnsmasq-dns-5788458d7f-4dccc\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.501318 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-config\") pod \"dnsmasq-dns-5788458d7f-4dccc\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.501853 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-dns-svc\") pod \"dnsmasq-dns-5788458d7f-4dccc\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.520842 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmbmt\" (UniqueName: \"kubernetes.io/projected/587562fd-5751-4c44-970b-bccd97cf59b2-kube-api-access-xmbmt\") pod \"dnsmasq-dns-5788458d7f-4dccc\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.616052 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.646113 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8ff9c764f-zjfr2"] Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.657119 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79f9c66c79-svxxk"] Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.658585 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.682341 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9c66c79-svxxk"] Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.808877 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqjrb\" (UniqueName: \"kubernetes.io/projected/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-kube-api-access-rqjrb\") pod \"dnsmasq-dns-79f9c66c79-svxxk\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.808936 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-dns-svc\") pod \"dnsmasq-dns-79f9c66c79-svxxk\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" 
Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.809025 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-config\") pod \"dnsmasq-dns-79f9c66c79-svxxk\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.910742 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-config\") pod \"dnsmasq-dns-79f9c66c79-svxxk\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.910874 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqjrb\" (UniqueName: \"kubernetes.io/projected/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-kube-api-access-rqjrb\") pod \"dnsmasq-dns-79f9c66c79-svxxk\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.910908 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-dns-svc\") pod \"dnsmasq-dns-79f9c66c79-svxxk\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.912040 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-config\") pod \"dnsmasq-dns-79f9c66c79-svxxk\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.912218 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-dns-svc\") pod \"dnsmasq-dns-79f9c66c79-svxxk\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.925546 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5788458d7f-4dccc"] Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.933082 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqjrb\" (UniqueName: \"kubernetes.io/projected/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-kube-api-access-rqjrb\") pod \"dnsmasq-dns-79f9c66c79-svxxk\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.965372 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69f8f5886f-tmcvm"] Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.967001 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.976060 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:18 crc kubenswrapper[4974]: I1013 18:29:18.982926 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69f8f5886f-tmcvm"] Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.113955 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-dns-svc\") pod \"dnsmasq-dns-69f8f5886f-tmcvm\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.114038 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x58sp\" (UniqueName: \"kubernetes.io/projected/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-kube-api-access-x58sp\") pod \"dnsmasq-dns-69f8f5886f-tmcvm\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.114104 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-config\") pod \"dnsmasq-dns-69f8f5886f-tmcvm\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.215179 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-dns-svc\") pod \"dnsmasq-dns-69f8f5886f-tmcvm\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.215257 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x58sp\" (UniqueName: 
\"kubernetes.io/projected/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-kube-api-access-x58sp\") pod \"dnsmasq-dns-69f8f5886f-tmcvm\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.215338 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-config\") pod \"dnsmasq-dns-69f8f5886f-tmcvm\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.216281 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-config\") pod \"dnsmasq-dns-69f8f5886f-tmcvm\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.216397 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-dns-svc\") pod \"dnsmasq-dns-69f8f5886f-tmcvm\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.230799 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x58sp\" (UniqueName: \"kubernetes.io/projected/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-kube-api-access-x58sp\") pod \"dnsmasq-dns-69f8f5886f-tmcvm\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.291095 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.461092 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.464702 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.467219 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.467903 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.468068 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wjncs" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.468197 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.468338 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.468417 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.468547 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.485778 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.620805 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.620876 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr2wt\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-kube-api-access-xr2wt\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.621035 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.621086 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.621121 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.621136 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.621157 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.621281 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.621382 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.621455 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.621554 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723353 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723427 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723494 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723549 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723606 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr2wt\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-kube-api-access-xr2wt\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723685 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723728 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723762 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723794 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723825 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 
18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.723878 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.724181 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.725178 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.726125 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.728174 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.728702 4974 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.729027 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.735143 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.743623 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.744257 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.746703 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-confd\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.748062 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr2wt\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-kube-api-access-xr2wt\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.757607 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.778142 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.779310 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.781602 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.781967 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.782109 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.782421 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-z2rqt" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.782577 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.782724 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.782920 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.795221 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.795454 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926344 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3edaa1a-d213-473f-963a-3bfea41226ec-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926419 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926443 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3edaa1a-d213-473f-963a-3bfea41226ec-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926480 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3edaa1a-d213-473f-963a-3bfea41226ec-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926500 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-plugins\") 
pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926533 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926610 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mxr\" (UniqueName: \"kubernetes.io/projected/a3edaa1a-d213-473f-963a-3bfea41226ec-kube-api-access-d4mxr\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926680 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3edaa1a-d213-473f-963a-3bfea41226ec-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926712 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3edaa1a-d213-473f-963a-3bfea41226ec-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926738 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:19 crc kubenswrapper[4974]: I1013 18:29:19.926768 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.028866 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3edaa1a-d213-473f-963a-3bfea41226ec-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.028948 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.028974 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3edaa1a-d213-473f-963a-3bfea41226ec-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.029008 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/a3edaa1a-d213-473f-963a-3bfea41226ec-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.029030 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.029066 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.029103 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mxr\" (UniqueName: \"kubernetes.io/projected/a3edaa1a-d213-473f-963a-3bfea41226ec-kube-api-access-d4mxr\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.029139 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3edaa1a-d213-473f-963a-3bfea41226ec-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.029168 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/a3edaa1a-d213-473f-963a-3bfea41226ec-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.029193 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.029225 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.031063 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.031282 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.031448 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: 
\"kubernetes.io/empty-dir/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.031633 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a3edaa1a-d213-473f-963a-3bfea41226ec-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.032209 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a3edaa1a-d213-473f-963a-3bfea41226ec-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.032379 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a3edaa1a-d213-473f-963a-3bfea41226ec-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.033134 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a3edaa1a-d213-473f-963a-3bfea41226ec-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.034745 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.042992 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a3edaa1a-d213-473f-963a-3bfea41226ec-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.043397 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a3edaa1a-d213-473f-963a-3bfea41226ec-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.045301 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mxr\" (UniqueName: \"kubernetes.io/projected/a3edaa1a-d213-473f-963a-3bfea41226ec-kube-api-access-d4mxr\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.053036 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"a3edaa1a-d213-473f-963a-3bfea41226ec\") " pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.085337 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.087498 4974 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.092773 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.093984 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.094106 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-m5zdm" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.094226 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.094330 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.094509 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.094593 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.109447 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.130873 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.232783 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97f813f5-f34c-4b82-b066-032f8b795049-pod-info\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.232832 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-server-conf\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.232856 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.232891 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.232928 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " 
pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.232954 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.232986 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.233006 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrl8p\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-kube-api-access-lrl8p\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.233033 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97f813f5-f34c-4b82-b066-032f8b795049-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.233056 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: 
I1013 18:29:20.233085 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-config-data\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334395 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334451 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-config-data\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334512 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97f813f5-f34c-4b82-b066-032f8b795049-pod-info\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334528 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-server-conf\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334546 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334572 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334601 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334619 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334646 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334678 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrl8p\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-kube-api-access-lrl8p\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " 
pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.334698 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97f813f5-f34c-4b82-b066-032f8b795049-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.335646 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.336117 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.336353 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.337364 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-config-data\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.337542 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-server-conf\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.337755 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.341622 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.341971 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97f813f5-f34c-4b82-b066-032f8b795049-pod-info\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.342530 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.347585 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97f813f5-f34c-4b82-b066-032f8b795049-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " 
pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.357049 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrl8p\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-kube-api-access-lrl8p\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.360689 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " pod="openstack/rabbitmq-server-0" Oct 13 18:29:20 crc kubenswrapper[4974]: I1013 18:29:20.426738 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.777706 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.782002 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.785885 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.789137 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.789741 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.790792 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.791027 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-ngg9f" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.791177 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.798281 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.860477 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/72dd5704-4623-4186-8914-512c4ea61a5b-config-data-default\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.860751 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/72dd5704-4623-4186-8914-512c4ea61a5b-secrets\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 
crc kubenswrapper[4974]: I1013 18:29:21.860787 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72dd5704-4623-4186-8914-512c4ea61a5b-kolla-config\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.860818 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wxt9\" (UniqueName: \"kubernetes.io/projected/72dd5704-4623-4186-8914-512c4ea61a5b-kube-api-access-6wxt9\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.860912 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/72dd5704-4623-4186-8914-512c4ea61a5b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.860942 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/72dd5704-4623-4186-8914-512c4ea61a5b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.860971 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72dd5704-4623-4186-8914-512c4ea61a5b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 
18:29:21.861034 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dd5704-4623-4186-8914-512c4ea61a5b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.861097 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.962436 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/72dd5704-4623-4186-8914-512c4ea61a5b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.962483 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72dd5704-4623-4186-8914-512c4ea61a5b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.962503 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/72dd5704-4623-4186-8914-512c4ea61a5b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.962554 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dd5704-4623-4186-8914-512c4ea61a5b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.962630 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.962684 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/72dd5704-4623-4186-8914-512c4ea61a5b-config-data-default\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.962709 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/72dd5704-4623-4186-8914-512c4ea61a5b-secrets\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.962726 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72dd5704-4623-4186-8914-512c4ea61a5b-kolla-config\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.962746 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wxt9\" (UniqueName: \"kubernetes.io/projected/72dd5704-4623-4186-8914-512c4ea61a5b-kube-api-access-6wxt9\") pod \"openstack-galera-0\" (UID: 
\"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.963374 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.963787 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/72dd5704-4623-4186-8914-512c4ea61a5b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.964906 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/72dd5704-4623-4186-8914-512c4ea61a5b-kolla-config\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.965292 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/72dd5704-4623-4186-8914-512c4ea61a5b-config-data-default\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.966424 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/72dd5704-4623-4186-8914-512c4ea61a5b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.967958 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/72dd5704-4623-4186-8914-512c4ea61a5b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.974505 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72dd5704-4623-4186-8914-512c4ea61a5b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.987013 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wxt9\" (UniqueName: \"kubernetes.io/projected/72dd5704-4623-4186-8914-512c4ea61a5b-kube-api-access-6wxt9\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.987374 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/72dd5704-4623-4186-8914-512c4ea61a5b-secrets\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:21 crc kubenswrapper[4974]: I1013 18:29:21.988598 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"72dd5704-4623-4186-8914-512c4ea61a5b\") " pod="openstack/openstack-galera-0" Oct 13 18:29:22 crc kubenswrapper[4974]: I1013 18:29:22.153266 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.224936 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.226512 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.230106 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-cskms" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.230365 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.230566 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.230762 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.240866 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.282530 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxnl9\" (UniqueName: \"kubernetes.io/projected/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-kube-api-access-bxnl9\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.282791 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.282816 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.282838 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.283012 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.283065 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.283095 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-secrets\") pod 
\"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.283165 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.283212 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.380226 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.383091 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.389333 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-lj9xr" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.389711 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.390429 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.391111 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.391187 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.391350 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxnl9\" (UniqueName: \"kubernetes.io/projected/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-kube-api-access-bxnl9\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.391398 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.391456 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.391496 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.391542 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.391573 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.391601 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " 
pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.395066 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.395357 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.396271 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.406086 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.411243 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: 
I1013 18:29:23.412545 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.393646 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.422724 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.423751 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.465242 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxnl9\" (UniqueName: \"kubernetes.io/projected/0c7ef8b9-b24d-4ddf-b764-41cbd10095e8-kube-api-access-bxnl9\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.473777 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-cell1-galera-0\" (UID: \"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8\") " pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 
18:29:23.492557 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/def34e48-c96a-4074-8780-44ba062e6816-memcached-tls-certs\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.492614 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/def34e48-c96a-4074-8780-44ba062e6816-kolla-config\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.492692 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def34e48-c96a-4074-8780-44ba062e6816-combined-ca-bundle\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.492730 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/def34e48-c96a-4074-8780-44ba062e6816-config-data\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.492765 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9td8v\" (UniqueName: \"kubernetes.io/projected/def34e48-c96a-4074-8780-44ba062e6816-kube-api-access-9td8v\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.553662 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.594190 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def34e48-c96a-4074-8780-44ba062e6816-combined-ca-bundle\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.594240 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/def34e48-c96a-4074-8780-44ba062e6816-config-data\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.594279 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9td8v\" (UniqueName: \"kubernetes.io/projected/def34e48-c96a-4074-8780-44ba062e6816-kube-api-access-9td8v\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.594309 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/def34e48-c96a-4074-8780-44ba062e6816-memcached-tls-certs\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.594333 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/def34e48-c96a-4074-8780-44ba062e6816-kolla-config\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.595933 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/def34e48-c96a-4074-8780-44ba062e6816-kolla-config\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.597082 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/def34e48-c96a-4074-8780-44ba062e6816-config-data\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.620124 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9td8v\" (UniqueName: \"kubernetes.io/projected/def34e48-c96a-4074-8780-44ba062e6816-kube-api-access-9td8v\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.622315 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/def34e48-c96a-4074-8780-44ba062e6816-memcached-tls-certs\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.624937 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/def34e48-c96a-4074-8780-44ba062e6816-combined-ca-bundle\") pod \"memcached-0\" (UID: \"def34e48-c96a-4074-8780-44ba062e6816\") " pod="openstack/memcached-0" Oct 13 18:29:23 crc kubenswrapper[4974]: I1013 18:29:23.831145 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Oct 13 18:29:25 crc kubenswrapper[4974]: I1013 18:29:25.229569 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 18:29:25 crc kubenswrapper[4974]: I1013 18:29:25.230939 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 18:29:25 crc kubenswrapper[4974]: I1013 18:29:25.237420 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-lrn52" Oct 13 18:29:25 crc kubenswrapper[4974]: I1013 18:29:25.242264 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 18:29:25 crc kubenswrapper[4974]: I1013 18:29:25.320205 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkxtg\" (UniqueName: \"kubernetes.io/projected/cfee0cca-5b93-4a97-acea-52b40d1e5a6b-kube-api-access-dkxtg\") pod \"kube-state-metrics-0\" (UID: \"cfee0cca-5b93-4a97-acea-52b40d1e5a6b\") " pod="openstack/kube-state-metrics-0" Oct 13 18:29:25 crc kubenswrapper[4974]: I1013 18:29:25.422264 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkxtg\" (UniqueName: \"kubernetes.io/projected/cfee0cca-5b93-4a97-acea-52b40d1e5a6b-kube-api-access-dkxtg\") pod \"kube-state-metrics-0\" (UID: \"cfee0cca-5b93-4a97-acea-52b40d1e5a6b\") " pod="openstack/kube-state-metrics-0" Oct 13 18:29:25 crc kubenswrapper[4974]: I1013 18:29:25.451554 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79f9c66c79-svxxk"] Oct 13 18:29:25 crc kubenswrapper[4974]: I1013 18:29:25.478974 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkxtg\" (UniqueName: \"kubernetes.io/projected/cfee0cca-5b93-4a97-acea-52b40d1e5a6b-kube-api-access-dkxtg\") pod \"kube-state-metrics-0\" (UID: 
\"cfee0cca-5b93-4a97-acea-52b40d1e5a6b\") " pod="openstack/kube-state-metrics-0" Oct 13 18:29:25 crc kubenswrapper[4974]: I1013 18:29:25.581054 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.538144 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.540630 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.542276 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.542781 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-97gxg" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.542790 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.542881 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.543254 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.550515 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.557342 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.653104 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.653180 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.653208 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.653266 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gmkn6\" (UniqueName: \"kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-kube-api-access-gmkn6\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.653292 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 
18:29:26.653383 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.653440 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.653581 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.755520 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.755603 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: 
\"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.755641 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.755684 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.755728 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.755748 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.755769 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config-out\") pod \"prometheus-metric-storage-0\" (UID: 
\"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.755796 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gmkn6\" (UniqueName: \"kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-kube-api-access-gmkn6\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.759580 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.760308 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.761585 4974 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.761706 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e453070dec09825da50fcd48128605195703a3e04c8868309f22a520ea4896c6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.763351 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.764481 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.765814 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.766916 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.778356 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gmkn6\" (UniqueName: \"kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-kube-api-access-gmkn6\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.806575 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:26 crc kubenswrapper[4974]: I1013 18:29:26.866245 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.227771 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r5pdv"] Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.232420 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.240053 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.240317 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-49sfm" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.241460 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.245198 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv"] Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.255615 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cxs58"] Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.257497 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.307477 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c233290a-abb8-4429-8500-f4ec541ccc21-combined-ca-bundle\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.307783 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-etc-ovs\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.307818 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjq28\" (UniqueName: \"kubernetes.io/projected/c233290a-abb8-4429-8500-f4ec541ccc21-kube-api-access-sjq28\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.307841 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c233290a-abb8-4429-8500-f4ec541ccc21-var-run-ovn\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.307862 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c233290a-abb8-4429-8500-f4ec541ccc21-ovn-controller-tls-certs\") pod \"ovn-controller-r5pdv\" (UID: 
\"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.307894 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-var-lib\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.307908 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d24d9e9c-90a1-490b-80d9-4d36d6050083-scripts\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.307938 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c233290a-abb8-4429-8500-f4ec541ccc21-scripts\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.307964 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87ndq\" (UniqueName: \"kubernetes.io/projected/d24d9e9c-90a1-490b-80d9-4d36d6050083-kube-api-access-87ndq\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.308157 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c233290a-abb8-4429-8500-f4ec541ccc21-var-log-ovn\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " 
pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.308191 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c233290a-abb8-4429-8500-f4ec541ccc21-var-run\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.308222 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-var-run\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.308251 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-var-log\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.309093 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cxs58"] Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409198 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c233290a-abb8-4429-8500-f4ec541ccc21-combined-ca-bundle\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409250 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-etc-ovs\") pod 
\"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409278 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjq28\" (UniqueName: \"kubernetes.io/projected/c233290a-abb8-4429-8500-f4ec541ccc21-kube-api-access-sjq28\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409298 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c233290a-abb8-4429-8500-f4ec541ccc21-var-run-ovn\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409317 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c233290a-abb8-4429-8500-f4ec541ccc21-ovn-controller-tls-certs\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409344 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-var-lib\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409360 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d24d9e9c-90a1-490b-80d9-4d36d6050083-scripts\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" 
Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409387 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c233290a-abb8-4429-8500-f4ec541ccc21-scripts\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409411 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87ndq\" (UniqueName: \"kubernetes.io/projected/d24d9e9c-90a1-490b-80d9-4d36d6050083-kube-api-access-87ndq\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409437 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c233290a-abb8-4429-8500-f4ec541ccc21-var-log-ovn\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409457 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c233290a-abb8-4429-8500-f4ec541ccc21-var-run\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409485 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-var-run\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.409510 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-log\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-var-log\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.410054 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-var-log\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.411156 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c233290a-abb8-4429-8500-f4ec541ccc21-var-log-ovn\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.413667 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c233290a-abb8-4429-8500-f4ec541ccc21-scripts\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.414138 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c233290a-abb8-4429-8500-f4ec541ccc21-var-run-ovn\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.414281 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-etc-ovs\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " 
pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.414533 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c233290a-abb8-4429-8500-f4ec541ccc21-var-run\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.414739 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-var-run\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.414917 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/d24d9e9c-90a1-490b-80d9-4d36d6050083-var-lib\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.419172 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c233290a-abb8-4429-8500-f4ec541ccc21-ovn-controller-tls-certs\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.429507 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d24d9e9c-90a1-490b-80d9-4d36d6050083-scripts\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.431323 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c233290a-abb8-4429-8500-f4ec541ccc21-combined-ca-bundle\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.436372 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjq28\" (UniqueName: \"kubernetes.io/projected/c233290a-abb8-4429-8500-f4ec541ccc21-kube-api-access-sjq28\") pod \"ovn-controller-r5pdv\" (UID: \"c233290a-abb8-4429-8500-f4ec541ccc21\") " pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.437620 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87ndq\" (UniqueName: \"kubernetes.io/projected/d24d9e9c-90a1-490b-80d9-4d36d6050083-kube-api-access-87ndq\") pod \"ovn-controller-ovs-cxs58\" (UID: \"d24d9e9c-90a1-490b-80d9-4d36d6050083\") " pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.614763 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:29 crc kubenswrapper[4974]: I1013 18:29:29.629034 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.216488 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.218915 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.222200 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.222481 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.222645 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.222748 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-dqxk6" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.224862 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.234939 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.322943 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxwcv\" (UniqueName: \"kubernetes.io/projected/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-kube-api-access-mxwcv\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.323009 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.323039 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-config\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.323065 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.323105 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.323152 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.323191 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.323216 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.424872 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxwcv\" (UniqueName: \"kubernetes.io/projected/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-kube-api-access-mxwcv\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.425200 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.425221 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-config\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.425239 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.425271 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.425310 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.425338 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.425356 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.426367 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.426898 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-config\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.426966 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.426378 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.433986 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.437216 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.437246 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.454975 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxwcv\" (UniqueName: \"kubernetes.io/projected/63c7b046-cd5e-42e0-b295-ca90bb6a53c9-kube-api-access-mxwcv\") pod \"ovsdbserver-sb-0\" (UID: 
\"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.486032 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"63c7b046-cd5e-42e0-b295-ca90bb6a53c9\") " pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:30 crc kubenswrapper[4974]: I1013 18:29:30.548503 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:31 crc kubenswrapper[4974]: I1013 18:29:31.026270 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Oct 13 18:29:31 crc kubenswrapper[4974]: I1013 18:29:31.305068 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" event={"ID":"6887cfd9-80d7-43bc-bcb6-01b264f72d5a","Type":"ContainerStarted","Data":"4697d6348cc2253566f6dca8031f337d66078cdd9f7d8aad43a52da1d52576cf"} Oct 13 18:29:31 crc kubenswrapper[4974]: W1013 18:29:31.422199 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3edaa1a_d213_473f_963a_3bfea41226ec.slice/crio-d4c4212f50eeea8d58730926e7d187217d4920a08aaa660bf8be460ba5bbd284 WatchSource:0}: Error finding container d4c4212f50eeea8d58730926e7d187217d4920a08aaa660bf8be460ba5bbd284: Status 404 returned error can't find the container with id d4c4212f50eeea8d58730926e7d187217d4920a08aaa660bf8be460ba5bbd284 Oct 13 18:29:31 crc kubenswrapper[4974]: E1013 18:29:31.491109 4974 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 13 18:29:31 crc kubenswrapper[4974]: E1013 18:29:31.491163 4974 kuberuntime_image.go:55] "Failed to pull 
image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 13 18:29:31 crc kubenswrapper[4974]: E1013 18:29:31.491278 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.119:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m244n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,Localhos
tProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8ff9c764f-zjfr2_openstack(348ed400-353d-42f7-86e3-b6360bba1408): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 18:29:31 crc kubenswrapper[4974]: E1013 18:29:31.502335 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" podUID="348ed400-353d-42f7-86e3-b6360bba1408" Oct 13 18:29:31 crc kubenswrapper[4974]: E1013 18:29:31.507388 4974 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 13 18:29:31 crc kubenswrapper[4974]: E1013 18:29:31.507426 4974 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Oct 13 18:29:31 crc kubenswrapper[4974]: E1013 18:29:31.507556 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.119:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nghsn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-68587c85b9-rcsdn_openstack(7f46dab1-c765-4cbb-8f0d-41071cf7bc1a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 18:29:31 crc kubenswrapper[4974]: E1013 18:29:31.508765 4974 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" podUID="7f46dab1-c765-4cbb-8f0d-41071cf7bc1a" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.048952 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5788458d7f-4dccc"] Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.078464 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 18:29:32 crc kubenswrapper[4974]: W1013 18:29:32.081845 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0d4fdc3_bd26_4379_8dfd_f6f3ce70a24f.slice/crio-f1f13a63f8c6e3d47b4841e911f465ed65f9664104e5af973100b4f6699109e9 WatchSource:0}: Error finding container f1f13a63f8c6e3d47b4841e911f465ed65f9664104e5af973100b4f6699109e9: Status 404 returned error can't find the container with id f1f13a63f8c6e3d47b4841e911f465ed65f9664104e5af973100b4f6699109e9 Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.196954 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 18:29:32 crc kubenswrapper[4974]: W1013 18:29:32.201778 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97f813f5_f34c_4b82_b066_032f8b795049.slice/crio-00c9001668603c5a423d64f2b2e013ac2195c48e85bd1d24d991cc31dd729db9 WatchSource:0}: Error finding container 00c9001668603c5a423d64f2b2e013ac2195c48e85bd1d24d991cc31dd729db9: Status 404 returned error can't find the container with id 00c9001668603c5a423d64f2b2e013ac2195c48e85bd1d24d991cc31dd729db9 Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.203081 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69f8f5886f-tmcvm"] Oct 13 18:29:32 crc 
kubenswrapper[4974]: W1013 18:29:32.209140 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbb5a5f0_3887_4c04_9a74_bba1bcf4b9e0.slice/crio-81f888f71e23daf5644b64e52a8720c60c84a0f045f0166df64249934f9b6a79 WatchSource:0}: Error finding container 81f888f71e23daf5644b64e52a8720c60c84a0f045f0166df64249934f9b6a79: Status 404 returned error can't find the container with id 81f888f71e23daf5644b64e52a8720c60c84a0f045f0166df64249934f9b6a79 Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.318457 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f813f5-f34c-4b82-b066-032f8b795049","Type":"ContainerStarted","Data":"00c9001668603c5a423d64f2b2e013ac2195c48e85bd1d24d991cc31dd729db9"} Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.320758 4974 generic.go:334] "Generic (PLEG): container finished" podID="6887cfd9-80d7-43bc-bcb6-01b264f72d5a" containerID="975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04" exitCode=0 Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.320815 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" event={"ID":"6887cfd9-80d7-43bc-bcb6-01b264f72d5a","Type":"ContainerDied","Data":"975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04"} Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.322826 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f","Type":"ContainerStarted","Data":"f1f13a63f8c6e3d47b4841e911f465ed65f9664104e5af973100b4f6699109e9"} Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.325468 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" 
event={"ID":"a3edaa1a-d213-473f-963a-3bfea41226ec","Type":"ContainerStarted","Data":"d4c4212f50eeea8d58730926e7d187217d4920a08aaa660bf8be460ba5bbd284"} Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.327789 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" event={"ID":"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0","Type":"ContainerStarted","Data":"81f888f71e23daf5644b64e52a8720c60c84a0f045f0166df64249934f9b6a79"} Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.329936 4974 generic.go:334] "Generic (PLEG): container finished" podID="587562fd-5751-4c44-970b-bccd97cf59b2" containerID="ffd8ff01b5ec0df9c2fa98bd4d0668909955f9669bfee6e278df5cac68382653" exitCode=0 Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.329996 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5788458d7f-4dccc" event={"ID":"587562fd-5751-4c44-970b-bccd97cf59b2","Type":"ContainerDied","Data":"ffd8ff01b5ec0df9c2fa98bd4d0668909955f9669bfee6e278df5cac68382653"} Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.330045 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5788458d7f-4dccc" event={"ID":"587562fd-5751-4c44-970b-bccd97cf59b2","Type":"ContainerStarted","Data":"0ce7345767fd1be91db2d6407e2956252ebef87259d4c1f4bd892582d46e4fdf"} Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.511223 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.519126 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.535452 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.544100 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Oct 13 18:29:32 
crc kubenswrapper[4974]: I1013 18:29:32.551046 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.574556 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv"] Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.636166 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cxs58"] Oct 13 18:29:32 crc kubenswrapper[4974]: W1013 18:29:32.708498 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd24d9e9c_90a1_490b_80d9_4d36d6050083.slice/crio-247ec795fc9973bd27f701db80f7ffbfddbbf80481fd38a20ce8dfb1ab219169 WatchSource:0}: Error finding container 247ec795fc9973bd27f701db80f7ffbfddbbf80481fd38a20ce8dfb1ab219169: Status 404 returned error can't find the container with id 247ec795fc9973bd27f701db80f7ffbfddbbf80481fd38a20ce8dfb1ab219169 Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.803618 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.806040 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.809722 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.814845 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.815122 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.815475 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.815780 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-pdbt9" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.893280 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.913622 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m244n\" (UniqueName: \"kubernetes.io/projected/348ed400-353d-42f7-86e3-b6360bba1408-kube-api-access-m244n\") pod \"348ed400-353d-42f7-86e3-b6360bba1408\" (UID: \"348ed400-353d-42f7-86e3-b6360bba1408\") " Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.913726 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348ed400-353d-42f7-86e3-b6360bba1408-config\") pod \"348ed400-353d-42f7-86e3-b6360bba1408\" (UID: \"348ed400-353d-42f7-86e3-b6360bba1408\") " Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.913901 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wrkj\" (UniqueName: \"kubernetes.io/projected/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-kube-api-access-2wrkj\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.913986 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.914038 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.914087 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.914111 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-config\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.914130 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.914165 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.914188 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.914975 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.916066 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/348ed400-353d-42f7-86e3-b6360bba1408-config" (OuterVolumeSpecName: "config") pod "348ed400-353d-42f7-86e3-b6360bba1408" (UID: "348ed400-353d-42f7-86e3-b6360bba1408"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.924897 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/348ed400-353d-42f7-86e3-b6360bba1408-kube-api-access-m244n" (OuterVolumeSpecName: "kube-api-access-m244n") pod "348ed400-353d-42f7-86e3-b6360bba1408" (UID: "348ed400-353d-42f7-86e3-b6360bba1408"). InnerVolumeSpecName "kube-api-access-m244n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:29:32 crc kubenswrapper[4974]: I1013 18:29:32.934416 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015246 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmbmt\" (UniqueName: \"kubernetes.io/projected/587562fd-5751-4c44-970b-bccd97cf59b2-kube-api-access-xmbmt\") pod \"587562fd-5751-4c44-970b-bccd97cf59b2\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015302 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-dns-svc\") pod \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015335 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-dns-svc\") pod \"587562fd-5751-4c44-970b-bccd97cf59b2\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015455 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-config\") pod \"587562fd-5751-4c44-970b-bccd97cf59b2\" (UID: \"587562fd-5751-4c44-970b-bccd97cf59b2\") " Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015487 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-config\") pod \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015627 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nghsn\" (UniqueName: 
\"kubernetes.io/projected/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-kube-api-access-nghsn\") pod \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\" (UID: \"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a\") " Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015871 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015917 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015957 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015977 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-config\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.015993 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc 
kubenswrapper[4974]: I1013 18:29:33.016015 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.016034 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.016059 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wrkj\" (UniqueName: \"kubernetes.io/projected/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-kube-api-access-2wrkj\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.016135 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m244n\" (UniqueName: \"kubernetes.io/projected/348ed400-353d-42f7-86e3-b6360bba1408-kube-api-access-m244n\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.016145 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348ed400-353d-42f7-86e3-b6360bba1408-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.016343 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f46dab1-c765-4cbb-8f0d-41071cf7bc1a" (UID: "7f46dab1-c765-4cbb-8f0d-41071cf7bc1a"). 
InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.017626 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.018229 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.018417 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.019625 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-config\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.020121 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-config" (OuterVolumeSpecName: "config") pod "7f46dab1-c765-4cbb-8f0d-41071cf7bc1a" (UID: "7f46dab1-c765-4cbb-8f0d-41071cf7bc1a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.021103 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.023053 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587562fd-5751-4c44-970b-bccd97cf59b2-kube-api-access-xmbmt" (OuterVolumeSpecName: "kube-api-access-xmbmt") pod "587562fd-5751-4c44-970b-bccd97cf59b2" (UID: "587562fd-5751-4c44-970b-bccd97cf59b2"). InnerVolumeSpecName "kube-api-access-xmbmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.023100 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.023991 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.026396 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-kube-api-access-nghsn" (OuterVolumeSpecName: "kube-api-access-nghsn") pod "7f46dab1-c765-4cbb-8f0d-41071cf7bc1a" (UID: "7f46dab1-c765-4cbb-8f0d-41071cf7bc1a"). 
InnerVolumeSpecName "kube-api-access-nghsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.030563 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wrkj\" (UniqueName: \"kubernetes.io/projected/eede053f-a3cf-4af6-9f1a-7458bec6f5a3-kube-api-access-2wrkj\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.045340 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "587562fd-5751-4c44-970b-bccd97cf59b2" (UID: "587562fd-5751-4c44-970b-bccd97cf59b2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.048887 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"ovsdbserver-nb-0\" (UID: \"eede053f-a3cf-4af6-9f1a-7458bec6f5a3\") " pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.050260 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-config" (OuterVolumeSpecName: "config") pod "587562fd-5751-4c44-970b-bccd97cf59b2" (UID: "587562fd-5751-4c44-970b-bccd97cf59b2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.118624 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nghsn\" (UniqueName: \"kubernetes.io/projected/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-kube-api-access-nghsn\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.118674 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmbmt\" (UniqueName: \"kubernetes.io/projected/587562fd-5751-4c44-970b-bccd97cf59b2-kube-api-access-xmbmt\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.118686 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.118697 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.118705 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/587562fd-5751-4c44-970b-bccd97cf59b2-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.118714 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.162845 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.356432 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9981a58f-bf58-44c5-9da9-fbe0ef9005a9","Type":"ContainerStarted","Data":"31297244d1308c0d63f1a2d04fc79b889566cc41398ad62d52beb138177ea30a"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.364502 4974 generic.go:334] "Generic (PLEG): container finished" podID="cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" containerID="8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a" exitCode=0 Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.364598 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" event={"ID":"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0","Type":"ContainerDied","Data":"8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.367344 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8","Type":"ContainerStarted","Data":"16db9b059b0114aa969221ac73e5699b159369ac370270c0ef5267b5302e3973"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.370931 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5788458d7f-4dccc" event={"ID":"587562fd-5751-4c44-970b-bccd97cf59b2","Type":"ContainerDied","Data":"0ce7345767fd1be91db2d6407e2956252ebef87259d4c1f4bd892582d46e4fdf"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.370981 4974 scope.go:117] "RemoveContainer" containerID="ffd8ff01b5ec0df9c2fa98bd4d0668909955f9669bfee6e278df5cac68382653" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.371629 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5788458d7f-4dccc" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.383096 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv" event={"ID":"c233290a-abb8-4429-8500-f4ec541ccc21","Type":"ContainerStarted","Data":"0d1a94e507be3a60a6781fffacae8c6fc3ebc76d0365b0fa0a49e02423de6d55"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.388486 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" event={"ID":"6887cfd9-80d7-43bc-bcb6-01b264f72d5a","Type":"ContainerStarted","Data":"0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.389329 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.392749 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" event={"ID":"348ed400-353d-42f7-86e3-b6360bba1408","Type":"ContainerDied","Data":"95c846d8c5c9be1d6a3975e755b406c8410847bca3b4be0f9faa7c6f1466b2a6"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.392819 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8ff9c764f-zjfr2" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.394529 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"def34e48-c96a-4074-8780-44ba062e6816","Type":"ContainerStarted","Data":"7e6844489e4f7ce040900931ad630a39c4bfe71e45170398d7b5e97695fc45a6"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.395332 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" event={"ID":"7f46dab1-c765-4cbb-8f0d-41071cf7bc1a","Type":"ContainerDied","Data":"dc484265b9d97af21ea0b2e231adeb8392820808f77d4eca0429375434de572e"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.395395 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68587c85b9-rcsdn" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.401264 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"72dd5704-4623-4186-8914-512c4ea61a5b","Type":"ContainerStarted","Data":"dbfa4df445241b639e5caa5e7e773b4282db2e09fffcc2138a5817b197d2d5f9"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.413200 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cxs58" event={"ID":"d24d9e9c-90a1-490b-80d9-4d36d6050083","Type":"ContainerStarted","Data":"247ec795fc9973bd27f701db80f7ffbfddbbf80481fd38a20ce8dfb1ab219169"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.413879 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" podStartSLOduration=14.308378654 podStartE2EDuration="15.413866524s" podCreationTimestamp="2025-10-13 18:29:18 +0000 UTC" firstStartedPulling="2025-10-13 18:29:30.600820338 +0000 UTC m=+905.505186418" lastFinishedPulling="2025-10-13 18:29:31.706308208 +0000 UTC m=+906.610674288" observedRunningTime="2025-10-13 18:29:33.412015872 
+0000 UTC m=+908.316381952" watchObservedRunningTime="2025-10-13 18:29:33.413866524 +0000 UTC m=+908.318232604" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.415970 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cfee0cca-5b93-4a97-acea-52b40d1e5a6b","Type":"ContainerStarted","Data":"bc233a57ea6a107b65dc4ed0c35fd58ad2bb9a76284a5e0d8bebd3a4d97514a8"} Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.474187 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5788458d7f-4dccc"] Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.480336 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5788458d7f-4dccc"] Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.506547 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68587c85b9-rcsdn"] Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.516185 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68587c85b9-rcsdn"] Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.551560 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8ff9c764f-zjfr2"] Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.556526 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8ff9c764f-zjfr2"] Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.613644 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Oct 13 18:29:33 crc kubenswrapper[4974]: W1013 18:29:33.629428 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63c7b046_cd5e_42e0_b295_ca90bb6a53c9.slice/crio-13158e29abea5e13af48498f0a860c895fb3a858026ec182660f83b42d6adf58 WatchSource:0}: Error finding container 13158e29abea5e13af48498f0a860c895fb3a858026ec182660f83b42d6adf58: Status 404 returned error 
can't find the container with id 13158e29abea5e13af48498f0a860c895fb3a858026ec182660f83b42d6adf58 Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.848824 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="348ed400-353d-42f7-86e3-b6360bba1408" path="/var/lib/kubelet/pods/348ed400-353d-42f7-86e3-b6360bba1408/volumes" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.849703 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587562fd-5751-4c44-970b-bccd97cf59b2" path="/var/lib/kubelet/pods/587562fd-5751-4c44-970b-bccd97cf59b2/volumes" Oct 13 18:29:33 crc kubenswrapper[4974]: I1013 18:29:33.850401 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f46dab1-c765-4cbb-8f0d-41071cf7bc1a" path="/var/lib/kubelet/pods/7f46dab1-c765-4cbb-8f0d-41071cf7bc1a/volumes" Oct 13 18:29:34 crc kubenswrapper[4974]: I1013 18:29:34.179831 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Oct 13 18:29:34 crc kubenswrapper[4974]: I1013 18:29:34.442089 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"63c7b046-cd5e-42e0-b295-ca90bb6a53c9","Type":"ContainerStarted","Data":"13158e29abea5e13af48498f0a860c895fb3a858026ec182660f83b42d6adf58"} Oct 13 18:29:37 crc kubenswrapper[4974]: I1013 18:29:37.467377 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eede053f-a3cf-4af6-9f1a-7458bec6f5a3","Type":"ContainerStarted","Data":"5bab1d2803810007e4dd946b71cb0391f868b0063edcb9b129d8e80bdcb15106"} Oct 13 18:29:38 crc kubenswrapper[4974]: I1013 18:29:38.978908 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:29:42 crc kubenswrapper[4974]: I1013 18:29:42.521852 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" 
event={"ID":"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0","Type":"ContainerStarted","Data":"8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b"} Oct 13 18:29:42 crc kubenswrapper[4974]: I1013 18:29:42.522420 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:42 crc kubenswrapper[4974]: I1013 18:29:42.562940 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" podStartSLOduration=24.562912368 podStartE2EDuration="24.562912368s" podCreationTimestamp="2025-10-13 18:29:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:29:42.545711813 +0000 UTC m=+917.450077973" watchObservedRunningTime="2025-10-13 18:29:42.562912368 +0000 UTC m=+917.467278488" Oct 13 18:29:43 crc kubenswrapper[4974]: I1013 18:29:43.533378 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8","Type":"ContainerStarted","Data":"c910392928be2935ded385b664d3e1cb4831b06df52c7d9a16b94a5683ac6ff5"} Oct 13 18:29:43 crc kubenswrapper[4974]: I1013 18:29:43.535709 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"def34e48-c96a-4074-8780-44ba062e6816","Type":"ContainerStarted","Data":"c4e04882ee887b61bf59f2b58146ffce0562e84420336d364cfa9ac9bedeed88"} Oct 13 18:29:43 crc kubenswrapper[4974]: I1013 18:29:43.536883 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Oct 13 18:29:43 crc kubenswrapper[4974]: I1013 18:29:43.540929 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"72dd5704-4623-4186-8914-512c4ea61a5b","Type":"ContainerStarted","Data":"541109e50e5371d91a61cd942643a0864812e78e3186164859cc0f7d7e14fabc"} Oct 13 18:29:43 crc 
kubenswrapper[4974]: I1013 18:29:43.544954 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cfee0cca-5b93-4a97-acea-52b40d1e5a6b","Type":"ContainerStarted","Data":"dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267"} Oct 13 18:29:43 crc kubenswrapper[4974]: I1013 18:29:43.601384 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.064306193 podStartE2EDuration="18.60135954s" podCreationTimestamp="2025-10-13 18:29:25 +0000 UTC" firstStartedPulling="2025-10-13 18:29:32.563987785 +0000 UTC m=+907.468353865" lastFinishedPulling="2025-10-13 18:29:43.101041122 +0000 UTC m=+918.005407212" observedRunningTime="2025-10-13 18:29:43.592521511 +0000 UTC m=+918.496887621" watchObservedRunningTime="2025-10-13 18:29:43.60135954 +0000 UTC m=+918.505725620" Oct 13 18:29:44 crc kubenswrapper[4974]: I1013 18:29:44.562236 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv" event={"ID":"c233290a-abb8-4429-8500-f4ec541ccc21","Type":"ContainerStarted","Data":"6668b7afd3b5ba5d2bf41a0a541e6c6b7087f62155f631592361cfe8839ababd"} Oct 13 18:29:44 crc kubenswrapper[4974]: I1013 18:29:44.562602 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-r5pdv" Oct 13 18:29:44 crc kubenswrapper[4974]: I1013 18:29:44.564925 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"63c7b046-cd5e-42e0-b295-ca90bb6a53c9","Type":"ContainerStarted","Data":"df66921ad2b802425ac9dc76fc1f4bed231f2ab332e8c8248f8de48bce109308"} Oct 13 18:29:44 crc kubenswrapper[4974]: I1013 18:29:44.568736 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eede053f-a3cf-4af6-9f1a-7458bec6f5a3","Type":"ContainerStarted","Data":"1c54064057204823a9705aec36ae28d05212e8936cd5345d7c82e83d8e6adfd9"} Oct 13 18:29:44 
crc kubenswrapper[4974]: I1013 18:29:44.568764 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 13 18:29:44 crc kubenswrapper[4974]: I1013 18:29:44.581132 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-r5pdv" podStartSLOduration=6.041046925 podStartE2EDuration="15.581116016s" podCreationTimestamp="2025-10-13 18:29:29 +0000 UTC" firstStartedPulling="2025-10-13 18:29:32.588442047 +0000 UTC m=+907.492808127" lastFinishedPulling="2025-10-13 18:29:42.128511128 +0000 UTC m=+917.032877218" observedRunningTime="2025-10-13 18:29:44.579635825 +0000 UTC m=+919.484001895" watchObservedRunningTime="2025-10-13 18:29:44.581116016 +0000 UTC m=+919.485482096" Oct 13 18:29:44 crc kubenswrapper[4974]: I1013 18:29:44.585461 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=13.008333569 podStartE2EDuration="21.585447369s" podCreationTimestamp="2025-10-13 18:29:23 +0000 UTC" firstStartedPulling="2025-10-13 18:29:32.564480219 +0000 UTC m=+907.468846299" lastFinishedPulling="2025-10-13 18:29:41.141594019 +0000 UTC m=+916.045960099" observedRunningTime="2025-10-13 18:29:43.641450451 +0000 UTC m=+918.545816541" watchObservedRunningTime="2025-10-13 18:29:44.585447369 +0000 UTC m=+919.489813449" Oct 13 18:29:45 crc kubenswrapper[4974]: I1013 18:29:45.576752 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f813f5-f34c-4b82-b066-032f8b795049","Type":"ContainerStarted","Data":"9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2"} Oct 13 18:29:45 crc kubenswrapper[4974]: I1013 18:29:45.578429 4974 generic.go:334] "Generic (PLEG): container finished" podID="d24d9e9c-90a1-490b-80d9-4d36d6050083" containerID="7a516a92d1ce17cb726461cbd455dd000613b76a7505a0d02ea7cf44940de4a9" exitCode=0 Oct 13 18:29:45 crc kubenswrapper[4974]: I1013 18:29:45.578511 
4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cxs58" event={"ID":"d24d9e9c-90a1-490b-80d9-4d36d6050083","Type":"ContainerDied","Data":"7a516a92d1ce17cb726461cbd455dd000613b76a7505a0d02ea7cf44940de4a9"} Oct 13 18:29:45 crc kubenswrapper[4974]: I1013 18:29:45.581037 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f","Type":"ContainerStarted","Data":"f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec"} Oct 13 18:29:45 crc kubenswrapper[4974]: I1013 18:29:45.582590 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"a3edaa1a-d213-473f-963a-3bfea41226ec","Type":"ContainerStarted","Data":"0e3d86bf6a5876919060cdf9a8524e67172b2a586cadba6d8a0d3e5c0cd22c2a"} Oct 13 18:29:46 crc kubenswrapper[4974]: I1013 18:29:46.592022 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9981a58f-bf58-44c5-9da9-fbe0ef9005a9","Type":"ContainerStarted","Data":"fef82051b786278429ce9f99d7d6776b6f30b36730555d1be52d8d010a0455d8"} Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.603293 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"eede053f-a3cf-4af6-9f1a-7458bec6f5a3","Type":"ContainerStarted","Data":"ab2b1ac751e9391e7cf3a12eef69e61dce9ce1a07eea87ce16f7fe367c512315"} Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.605767 4974 generic.go:334] "Generic (PLEG): container finished" podID="0c7ef8b9-b24d-4ddf-b764-41cbd10095e8" containerID="c910392928be2935ded385b664d3e1cb4831b06df52c7d9a16b94a5683ac6ff5" exitCode=0 Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.605838 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8","Type":"ContainerDied","Data":"c910392928be2935ded385b664d3e1cb4831b06df52c7d9a16b94a5683ac6ff5"} Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.610842 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cxs58" event={"ID":"d24d9e9c-90a1-490b-80d9-4d36d6050083","Type":"ContainerStarted","Data":"6409c72f88087e4aab38e5a9e5f55e5add22e928863f3d56a076ce692c7d15fb"} Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.610906 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.610929 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.610945 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cxs58" event={"ID":"d24d9e9c-90a1-490b-80d9-4d36d6050083","Type":"ContainerStarted","Data":"b1786e702a09770ecb71845e1d00a5dba19b440e6cdc76da683dc42ab00bf7d6"} Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.613985 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"63c7b046-cd5e-42e0-b295-ca90bb6a53c9","Type":"ContainerStarted","Data":"a2e7d2a8def9f34d9bcdfeba41f6f9858f6cbbfe6913926cea4d54c8616e86d1"} Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.638004 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=6.6880974890000005 podStartE2EDuration="16.637986115s" podCreationTimestamp="2025-10-13 18:29:31 +0000 UTC" firstStartedPulling="2025-10-13 18:29:36.895260092 +0000 UTC m=+911.799626182" lastFinishedPulling="2025-10-13 18:29:46.845148718 +0000 UTC m=+921.749514808" observedRunningTime="2025-10-13 18:29:47.634578098 +0000 UTC m=+922.538944178" watchObservedRunningTime="2025-10-13 
18:29:47.637986115 +0000 UTC m=+922.542352215" Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.672178 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cxs58" podStartSLOduration=9.872809685 podStartE2EDuration="18.672139558s" podCreationTimestamp="2025-10-13 18:29:29 +0000 UTC" firstStartedPulling="2025-10-13 18:29:32.724437663 +0000 UTC m=+907.628803743" lastFinishedPulling="2025-10-13 18:29:41.523767546 +0000 UTC m=+916.428133616" observedRunningTime="2025-10-13 18:29:47.660967613 +0000 UTC m=+922.565333733" watchObservedRunningTime="2025-10-13 18:29:47.672139558 +0000 UTC m=+922.576505678" Oct 13 18:29:47 crc kubenswrapper[4974]: I1013 18:29:47.725551 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.532714354 podStartE2EDuration="18.725532733s" podCreationTimestamp="2025-10-13 18:29:29 +0000 UTC" firstStartedPulling="2025-10-13 18:29:33.639645681 +0000 UTC m=+908.544011761" lastFinishedPulling="2025-10-13 18:29:46.83246406 +0000 UTC m=+921.736830140" observedRunningTime="2025-10-13 18:29:47.721211821 +0000 UTC m=+922.625577901" watchObservedRunningTime="2025-10-13 18:29:47.725532733 +0000 UTC m=+922.629898813" Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.163009 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.163240 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.224823 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.549142 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:48 crc kubenswrapper[4974]: 
I1013 18:29:48.624948 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.630698 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"0c7ef8b9-b24d-4ddf-b764-41cbd10095e8","Type":"ContainerStarted","Data":"e60530d9d1538754b68ff13ddb34dada1894d819b71e52004eaf0a8739c5fb10"} Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.633025 4974 generic.go:334] "Generic (PLEG): container finished" podID="72dd5704-4623-4186-8914-512c4ea61a5b" containerID="541109e50e5371d91a61cd942643a0864812e78e3186164859cc0f7d7e14fabc" exitCode=0 Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.633167 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"72dd5704-4623-4186-8914-512c4ea61a5b","Type":"ContainerDied","Data":"541109e50e5371d91a61cd942643a0864812e78e3186164859cc0f7d7e14fabc"} Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.634998 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.732518 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.737432 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.782144 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.329276001 podStartE2EDuration="26.782111296s" podCreationTimestamp="2025-10-13 18:29:22 +0000 UTC" firstStartedPulling="2025-10-13 18:29:32.560427234 +0000 UTC m=+907.464793314" lastFinishedPulling="2025-10-13 18:29:42.013262529 +0000 UTC m=+916.917628609" observedRunningTime="2025-10-13 
18:29:48.74323926 +0000 UTC m=+923.647605350" watchObservedRunningTime="2025-10-13 18:29:48.782111296 +0000 UTC m=+923.686477416" Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.832797 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.972522 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69f8f5886f-tmcvm"] Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.972801 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" podUID="cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" containerName="dnsmasq-dns" containerID="cri-o://8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b" gracePeriod=10 Oct 13 18:29:48 crc kubenswrapper[4974]: I1013 18:29:48.976772 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.004588 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b748cc897-lvqfd"] Oct 13 18:29:49 crc kubenswrapper[4974]: E1013 18:29:49.005063 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="587562fd-5751-4c44-970b-bccd97cf59b2" containerName="init" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.005082 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="587562fd-5751-4c44-970b-bccd97cf59b2" containerName="init" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.005272 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="587562fd-5751-4c44-970b-bccd97cf59b2" containerName="init" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.006332 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.012171 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.021162 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b748cc897-lvqfd"] Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.057194 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-ovsdbserver-nb\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.057272 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-config\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.057422 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9ct2\" (UniqueName: \"kubernetes.io/projected/bca920cc-b99a-42e1-b738-24e1294c9d87-kube-api-access-h9ct2\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.057570 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-dns-svc\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " 
pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.133660 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-tvbtp"] Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.134616 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tvbtp"] Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.134702 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.139816 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.162730 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9ct2\" (UniqueName: \"kubernetes.io/projected/bca920cc-b99a-42e1-b738-24e1294c9d87-kube-api-access-h9ct2\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.162786 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-dns-svc\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.162849 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-ovsdbserver-nb\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.162895 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-config\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.163736 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-config\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.164053 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-dns-svc\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.164813 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-ovsdbserver-nb\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.205014 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9ct2\" (UniqueName: \"kubernetes.io/projected/bca920cc-b99a-42e1-b738-24e1294c9d87-kube-api-access-h9ct2\") pod \"dnsmasq-dns-7b748cc897-lvqfd\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.207893 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b748cc897-lvqfd"] Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 
18:29:49.209137 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.248824 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.250181 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.256278 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b6dd89997-vdvds"] Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.257078 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.257354 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-bkzs7" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.258075 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.267703 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.267928 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.268066 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.268580 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.269452 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d4633f8-1f68-4d44-8569-69f02e1886f3-combined-ca-bundle\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.269501 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2d4633f8-1f68-4d44-8569-69f02e1886f3-ovn-rundir\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.269538 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d4633f8-1f68-4d44-8569-69f02e1886f3-config\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.269588 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vj9m\" (UniqueName: \"kubernetes.io/projected/2d4633f8-1f68-4d44-8569-69f02e1886f3-kube-api-access-4vj9m\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.269715 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2d4633f8-1f68-4d44-8569-69f02e1886f3-ovs-rundir\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.270007 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d4633f8-1f68-4d44-8569-69f02e1886f3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.273485 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6dd89997-vdvds"] Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.293152 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" podUID="cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.105:5353: connect: connection refused" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371548 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d4633f8-1f68-4d44-8569-69f02e1886f3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tvbtp\" (UID: 
\"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371611 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-dns-svc\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371663 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d4633f8-1f68-4d44-8569-69f02e1886f3-combined-ca-bundle\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371682 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2d4633f8-1f68-4d44-8569-69f02e1886f3-ovn-rundir\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371731 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97fd37f3-ee6b-4a37-a32b-057d21edf416-config\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371751 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fd37f3-ee6b-4a37-a32b-057d21edf416-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" 
Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371774 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d4633f8-1f68-4d44-8569-69f02e1886f3-config\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371818 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vj9m\" (UniqueName: \"kubernetes.io/projected/2d4633f8-1f68-4d44-8569-69f02e1886f3-kube-api-access-4vj9m\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371841 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97fd37f3-ee6b-4a37-a32b-057d21edf416-scripts\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371861 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcsvr\" (UniqueName: \"kubernetes.io/projected/050b7c88-3e41-445b-af12-c14a4cb66736-kube-api-access-mcsvr\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371909 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2d4633f8-1f68-4d44-8569-69f02e1886f3-ovs-rundir\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 
18:29:49.371947 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fd37f3-ee6b-4a37-a32b-057d21edf416-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.371993 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.372017 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-config\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.372055 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97fd37f3-ee6b-4a37-a32b-057d21edf416-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.372071 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cf2sv\" (UniqueName: \"kubernetes.io/projected/97fd37f3-ee6b-4a37-a32b-057d21edf416-kube-api-access-cf2sv\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.372092 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fd37f3-ee6b-4a37-a32b-057d21edf416-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.372129 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.372851 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2d4633f8-1f68-4d44-8569-69f02e1886f3-ovn-rundir\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.372939 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2d4633f8-1f68-4d44-8569-69f02e1886f3-ovs-rundir\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.378097 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d4633f8-1f68-4d44-8569-69f02e1886f3-config\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.379284 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2d4633f8-1f68-4d44-8569-69f02e1886f3-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.382368 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d4633f8-1f68-4d44-8569-69f02e1886f3-combined-ca-bundle\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.399103 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vj9m\" (UniqueName: \"kubernetes.io/projected/2d4633f8-1f68-4d44-8569-69f02e1886f3-kube-api-access-4vj9m\") pod \"ovn-controller-metrics-tvbtp\" (UID: \"2d4633f8-1f68-4d44-8569-69f02e1886f3\") " pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.473692 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97fd37f3-ee6b-4a37-a32b-057d21edf416-scripts\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.473733 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcsvr\" (UniqueName: \"kubernetes.io/projected/050b7c88-3e41-445b-af12-c14a4cb66736-kube-api-access-mcsvr\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.473804 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/97fd37f3-ee6b-4a37-a32b-057d21edf416-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.473831 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.473853 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-config\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.473881 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97fd37f3-ee6b-4a37-a32b-057d21edf416-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.473901 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cf2sv\" (UniqueName: \"kubernetes.io/projected/97fd37f3-ee6b-4a37-a32b-057d21edf416-kube-api-access-cf2sv\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.473928 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fd37f3-ee6b-4a37-a32b-057d21edf416-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: 
\"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.473947 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.473979 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-dns-svc\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.474008 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97fd37f3-ee6b-4a37-a32b-057d21edf416-config\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.474022 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fd37f3-ee6b-4a37-a32b-057d21edf416-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.474923 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/97fd37f3-ee6b-4a37-a32b-057d21edf416-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.475602 4974 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/97fd37f3-ee6b-4a37-a32b-057d21edf416-scripts\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.476161 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-nb\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.476987 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97fd37f3-ee6b-4a37-a32b-057d21edf416-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.477555 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-dns-svc\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.478551 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-sb\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.478590 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97fd37f3-ee6b-4a37-a32b-057d21edf416-config\") pod \"ovn-northd-0\" (UID: 
\"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.479118 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-config\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.479845 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fd37f3-ee6b-4a37-a32b-057d21edf416-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.486764 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/97fd37f3-ee6b-4a37-a32b-057d21edf416-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.491510 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cf2sv\" (UniqueName: \"kubernetes.io/projected/97fd37f3-ee6b-4a37-a32b-057d21edf416-kube-api-access-cf2sv\") pod \"ovn-northd-0\" (UID: \"97fd37f3-ee6b-4a37-a32b-057d21edf416\") " pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.493150 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-tvbtp" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.497244 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcsvr\" (UniqueName: \"kubernetes.io/projected/050b7c88-3e41-445b-af12-c14a4cb66736-kube-api-access-mcsvr\") pod \"dnsmasq-dns-6b6dd89997-vdvds\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.525810 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.635257 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.641204 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.647902 4974 generic.go:334] "Generic (PLEG): container finished" podID="cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" containerID="8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b" exitCode=0 Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.647967 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.647970 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" event={"ID":"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0","Type":"ContainerDied","Data":"8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b"} Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.648086 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69f8f5886f-tmcvm" event={"ID":"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0","Type":"ContainerDied","Data":"81f888f71e23daf5644b64e52a8720c60c84a0f045f0166df64249934f9b6a79"} Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.648109 4974 scope.go:117] "RemoveContainer" containerID="8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.651235 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"72dd5704-4623-4186-8914-512c4ea61a5b","Type":"ContainerStarted","Data":"127ec43787a83a47abd20db5c769375861d4107b13017770e7fac09120e460a5"} Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.677066 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-dns-svc\") pod \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.677271 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-config\") pod \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.677457 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-x58sp\" (UniqueName: \"kubernetes.io/projected/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-kube-api-access-x58sp\") pod \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\" (UID: \"cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0\") " Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.679578 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.97711106 podStartE2EDuration="29.679562682s" podCreationTimestamp="2025-10-13 18:29:20 +0000 UTC" firstStartedPulling="2025-10-13 18:29:32.577889938 +0000 UTC m=+907.482256018" lastFinishedPulling="2025-10-13 18:29:42.28034153 +0000 UTC m=+917.184707640" observedRunningTime="2025-10-13 18:29:49.67450605 +0000 UTC m=+924.578872140" watchObservedRunningTime="2025-10-13 18:29:49.679562682 +0000 UTC m=+924.583928762" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.690850 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-kube-api-access-x58sp" (OuterVolumeSpecName: "kube-api-access-x58sp") pod "cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" (UID: "cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0"). InnerVolumeSpecName "kube-api-access-x58sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.707057 4974 scope.go:117] "RemoveContainer" containerID="8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.730026 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" (UID: "cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.742082 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-config" (OuterVolumeSpecName: "config") pod "cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" (UID: "cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.770161 4974 scope.go:117] "RemoveContainer" containerID="8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b" Oct 13 18:29:49 crc kubenswrapper[4974]: E1013 18:29:49.774109 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b\": container with ID starting with 8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b not found: ID does not exist" containerID="8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.774154 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b"} err="failed to get container status \"8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b\": rpc error: code = NotFound desc = could not find container \"8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b\": container with ID starting with 8789416f0ec8416ba1bb95fc968f5e49b43a1df3cb1fc52f7264e9e009d3f57b not found: ID does not exist" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.774186 4974 scope.go:117] "RemoveContainer" containerID="8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a" Oct 13 18:29:49 crc kubenswrapper[4974]: E1013 18:29:49.776316 4974 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a\": container with ID starting with 8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a not found: ID does not exist" containerID="8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.776359 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a"} err="failed to get container status \"8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a\": rpc error: code = NotFound desc = could not find container \"8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a\": container with ID starting with 8370d4b131a607a2521882cc92d66028179b1ddd55d727e7164ede6ab958614a not found: ID does not exist" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.793056 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x58sp\" (UniqueName: \"kubernetes.io/projected/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-kube-api-access-x58sp\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.793674 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.793876 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.809264 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b748cc897-lvqfd"] Oct 13 18:29:49 crc kubenswrapper[4974]: I1013 18:29:49.954029 4974 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/ovn-controller-metrics-tvbtp"] Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.002348 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69f8f5886f-tmcvm"] Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.013207 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69f8f5886f-tmcvm"] Oct 13 18:29:50 crc kubenswrapper[4974]: E1013 18:29:50.020248 4974 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbb5a5f0_3887_4c04_9a74_bba1bcf4b9e0.slice\": RecentStats: unable to find data in memory cache]" Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.156536 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b6dd89997-vdvds"] Oct 13 18:29:50 crc kubenswrapper[4974]: W1013 18:29:50.162814 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod050b7c88_3e41_445b_af12_c14a4cb66736.slice/crio-1c6c212c3cb0750d4195a956641fcbd58b62959559895f5d7fa40ef9008cd173 WatchSource:0}: Error finding container 1c6c212c3cb0750d4195a956641fcbd58b62959559895f5d7fa40ef9008cd173: Status 404 returned error can't find the container with id 1c6c212c3cb0750d4195a956641fcbd58b62959559895f5d7fa40ef9008cd173 Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.315812 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Oct 13 18:29:50 crc kubenswrapper[4974]: W1013 18:29:50.356269 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97fd37f3_ee6b_4a37_a32b_057d21edf416.slice/crio-838225408dd73a66fa75e54c2719bde610f6296219f65df37173a68478e18451 WatchSource:0}: Error finding container 
838225408dd73a66fa75e54c2719bde610f6296219f65df37173a68478e18451: Status 404 returned error can't find the container with id 838225408dd73a66fa75e54c2719bde610f6296219f65df37173a68478e18451 Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.672696 4974 generic.go:334] "Generic (PLEG): container finished" podID="050b7c88-3e41-445b-af12-c14a4cb66736" containerID="1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8" exitCode=0 Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.672795 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" event={"ID":"050b7c88-3e41-445b-af12-c14a4cb66736","Type":"ContainerDied","Data":"1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8"} Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.672848 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" event={"ID":"050b7c88-3e41-445b-af12-c14a4cb66736","Type":"ContainerStarted","Data":"1c6c212c3cb0750d4195a956641fcbd58b62959559895f5d7fa40ef9008cd173"} Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.675434 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"97fd37f3-ee6b-4a37-a32b-057d21edf416","Type":"ContainerStarted","Data":"838225408dd73a66fa75e54c2719bde610f6296219f65df37173a68478e18451"} Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.682599 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tvbtp" event={"ID":"2d4633f8-1f68-4d44-8569-69f02e1886f3","Type":"ContainerStarted","Data":"29d01e627e31313d2069b6e1cf18a597b87d7002d14871334d1bc1155b659013"} Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.683244 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-tvbtp" 
event={"ID":"2d4633f8-1f68-4d44-8569-69f02e1886f3","Type":"ContainerStarted","Data":"2fcab0620bdb246ce1362f49346c4100d3750516a37697046eb695a5905c052f"} Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.685113 4974 generic.go:334] "Generic (PLEG): container finished" podID="bca920cc-b99a-42e1-b738-24e1294c9d87" containerID="76984537c52b42bbdaa7fdac7e576017bc6f59ea5d775345d527029cc50b8d00" exitCode=0 Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.685313 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" event={"ID":"bca920cc-b99a-42e1-b738-24e1294c9d87","Type":"ContainerDied","Data":"76984537c52b42bbdaa7fdac7e576017bc6f59ea5d775345d527029cc50b8d00"} Oct 13 18:29:50 crc kubenswrapper[4974]: I1013 18:29:50.685363 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" event={"ID":"bca920cc-b99a-42e1-b738-24e1294c9d87","Type":"ContainerStarted","Data":"66acdf325b5894ce0728e6b9dd938b5f8335e4f547f9b0800c5df99b881161fd"} Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.166230 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.193764 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-tvbtp" podStartSLOduration=2.193742589 podStartE2EDuration="2.193742589s" podCreationTimestamp="2025-10-13 18:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:29:50.731085203 +0000 UTC m=+925.635451283" watchObservedRunningTime="2025-10-13 18:29:51.193742589 +0000 UTC m=+926.098108679" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.216989 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-ovsdbserver-nb\") pod \"bca920cc-b99a-42e1-b738-24e1294c9d87\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.217413 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-config\") pod \"bca920cc-b99a-42e1-b738-24e1294c9d87\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.217547 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9ct2\" (UniqueName: \"kubernetes.io/projected/bca920cc-b99a-42e1-b738-24e1294c9d87-kube-api-access-h9ct2\") pod \"bca920cc-b99a-42e1-b738-24e1294c9d87\" (UID: \"bca920cc-b99a-42e1-b738-24e1294c9d87\") " Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.217953 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-dns-svc\") pod \"bca920cc-b99a-42e1-b738-24e1294c9d87\" (UID: 
\"bca920cc-b99a-42e1-b738-24e1294c9d87\") " Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.222819 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bca920cc-b99a-42e1-b738-24e1294c9d87-kube-api-access-h9ct2" (OuterVolumeSpecName: "kube-api-access-h9ct2") pod "bca920cc-b99a-42e1-b738-24e1294c9d87" (UID: "bca920cc-b99a-42e1-b738-24e1294c9d87"). InnerVolumeSpecName "kube-api-access-h9ct2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.239688 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bca920cc-b99a-42e1-b738-24e1294c9d87" (UID: "bca920cc-b99a-42e1-b738-24e1294c9d87"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.243502 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-config" (OuterVolumeSpecName: "config") pod "bca920cc-b99a-42e1-b738-24e1294c9d87" (UID: "bca920cc-b99a-42e1-b738-24e1294c9d87"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.245309 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bca920cc-b99a-42e1-b738-24e1294c9d87" (UID: "bca920cc-b99a-42e1-b738-24e1294c9d87"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.320473 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.321956 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9ct2\" (UniqueName: \"kubernetes.io/projected/bca920cc-b99a-42e1-b738-24e1294c9d87-kube-api-access-h9ct2\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.322000 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.322273 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bca920cc-b99a-42e1-b738-24e1294c9d87-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.696294 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" event={"ID":"050b7c88-3e41-445b-af12-c14a4cb66736","Type":"ContainerStarted","Data":"edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b"} Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.696850 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.699642 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"97fd37f3-ee6b-4a37-a32b-057d21edf416","Type":"ContainerStarted","Data":"370540c5efd242a541b59b40568df96aeae69fdde48ed10cb2290d78016cc465"} Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.699731 4974 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"97fd37f3-ee6b-4a37-a32b-057d21edf416","Type":"ContainerStarted","Data":"df8a32e6aa9ee9b42e1f890cbfd68270cf80fb183cbd0bc3e2c3c26a4625559a"} Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.699760 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.701768 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" event={"ID":"bca920cc-b99a-42e1-b738-24e1294c9d87","Type":"ContainerDied","Data":"66acdf325b5894ce0728e6b9dd938b5f8335e4f547f9b0800c5df99b881161fd"} Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.701831 4974 scope.go:117] "RemoveContainer" containerID="76984537c52b42bbdaa7fdac7e576017bc6f59ea5d775345d527029cc50b8d00" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.701893 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b748cc897-lvqfd" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.723211 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" podStartSLOduration=2.723182218 podStartE2EDuration="2.723182218s" podCreationTimestamp="2025-10-13 18:29:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:29:51.717502767 +0000 UTC m=+926.621868847" watchObservedRunningTime="2025-10-13 18:29:51.723182218 +0000 UTC m=+926.627548338" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.747009 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.909636267 podStartE2EDuration="2.746970438s" podCreationTimestamp="2025-10-13 18:29:49 +0000 UTC" firstStartedPulling="2025-10-13 18:29:50.359094454 +0000 UTC m=+925.263460544" 
lastFinishedPulling="2025-10-13 18:29:51.196428635 +0000 UTC m=+926.100794715" observedRunningTime="2025-10-13 18:29:51.743996455 +0000 UTC m=+926.648362545" watchObservedRunningTime="2025-10-13 18:29:51.746970438 +0000 UTC m=+926.651336538" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.802049 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b748cc897-lvqfd"] Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.807264 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b748cc897-lvqfd"] Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.825148 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bca920cc-b99a-42e1-b738-24e1294c9d87" path="/var/lib/kubelet/pods/bca920cc-b99a-42e1-b738-24e1294c9d87/volumes" Oct 13 18:29:51 crc kubenswrapper[4974]: I1013 18:29:51.826929 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" path="/var/lib/kubelet/pods/cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0/volumes" Oct 13 18:29:52 crc kubenswrapper[4974]: I1013 18:29:52.154937 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Oct 13 18:29:52 crc kubenswrapper[4974]: I1013 18:29:52.154985 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Oct 13 18:29:53 crc kubenswrapper[4974]: I1013 18:29:53.554935 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:53 crc kubenswrapper[4974]: I1013 18:29:53.555296 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:53 crc kubenswrapper[4974]: I1013 18:29:53.635161 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:53 crc kubenswrapper[4974]: I1013 
18:29:53.829258 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Oct 13 18:29:54 crc kubenswrapper[4974]: I1013 18:29:54.750271 4974 generic.go:334] "Generic (PLEG): container finished" podID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerID="fef82051b786278429ce9f99d7d6776b6f30b36730555d1be52d8d010a0455d8" exitCode=0 Oct 13 18:29:54 crc kubenswrapper[4974]: I1013 18:29:54.750378 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9981a58f-bf58-44c5-9da9-fbe0ef9005a9","Type":"ContainerDied","Data":"fef82051b786278429ce9f99d7d6776b6f30b36730555d1be52d8d010a0455d8"} Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.585432 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.740919 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6dd89997-vdvds"] Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.741584 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" podUID="050b7c88-3e41-445b-af12-c14a4cb66736" containerName="dnsmasq-dns" containerID="cri-o://edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b" gracePeriod=10 Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.792393 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-65cff4f877-2mckm"] Oct 13 18:29:55 crc kubenswrapper[4974]: E1013 18:29:55.792763 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" containerName="init" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.792776 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" containerName="init" Oct 13 18:29:55 crc kubenswrapper[4974]: E1013 18:29:55.792793 4974 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" containerName="dnsmasq-dns" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.792799 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" containerName="dnsmasq-dns" Oct 13 18:29:55 crc kubenswrapper[4974]: E1013 18:29:55.792816 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bca920cc-b99a-42e1-b738-24e1294c9d87" containerName="init" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.792822 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="bca920cc-b99a-42e1-b738-24e1294c9d87" containerName="init" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.792997 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbb5a5f0-3887-4c04-9a74-bba1bcf4b9e0" containerName="dnsmasq-dns" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.793077 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="bca920cc-b99a-42e1-b738-24e1294c9d87" containerName="init" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.794057 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.848338 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65cff4f877-2mckm"] Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.909479 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-config\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.909534 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-nb\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.909595 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5spjq\" (UniqueName: \"kubernetes.io/projected/37628139-a9e5-4396-a102-b1cc1de2fd2e-kube-api-access-5spjq\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.909609 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-dns-svc\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:55 crc kubenswrapper[4974]: I1013 18:29:55.909664 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-sb\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.011239 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-config\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.011288 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-nb\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.011339 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-dns-svc\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.011356 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5spjq\" (UniqueName: \"kubernetes.io/projected/37628139-a9e5-4396-a102-b1cc1de2fd2e-kube-api-access-5spjq\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.011404 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-sb\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.012194 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-config\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.012201 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-sb\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.012263 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-nb\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.013961 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-dns-svc\") pod \"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.036046 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5spjq\" (UniqueName: \"kubernetes.io/projected/37628139-a9e5-4396-a102-b1cc1de2fd2e-kube-api-access-5spjq\") pod 
\"dnsmasq-dns-65cff4f877-2mckm\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.148796 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.249230 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.287417 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.358299 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.418022 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-sb\") pod \"050b7c88-3e41-445b-af12-c14a4cb66736\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.418320 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-dns-svc\") pod \"050b7c88-3e41-445b-af12-c14a4cb66736\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.418354 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-nb\") pod \"050b7c88-3e41-445b-af12-c14a4cb66736\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.418402 4974 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-config\") pod \"050b7c88-3e41-445b-af12-c14a4cb66736\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.418464 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcsvr\" (UniqueName: \"kubernetes.io/projected/050b7c88-3e41-445b-af12-c14a4cb66736-kube-api-access-mcsvr\") pod \"050b7c88-3e41-445b-af12-c14a4cb66736\" (UID: \"050b7c88-3e41-445b-af12-c14a4cb66736\") " Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.426736 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/050b7c88-3e41-445b-af12-c14a4cb66736-kube-api-access-mcsvr" (OuterVolumeSpecName: "kube-api-access-mcsvr") pod "050b7c88-3e41-445b-af12-c14a4cb66736" (UID: "050b7c88-3e41-445b-af12-c14a4cb66736"). InnerVolumeSpecName "kube-api-access-mcsvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.477039 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "050b7c88-3e41-445b-af12-c14a4cb66736" (UID: "050b7c88-3e41-445b-af12-c14a4cb66736"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.482047 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "050b7c88-3e41-445b-af12-c14a4cb66736" (UID: "050b7c88-3e41-445b-af12-c14a4cb66736"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.482968 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-config" (OuterVolumeSpecName: "config") pod "050b7c88-3e41-445b-af12-c14a4cb66736" (UID: "050b7c88-3e41-445b-af12-c14a4cb66736"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.488690 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "050b7c88-3e41-445b-af12-c14a4cb66736" (UID: "050b7c88-3e41-445b-af12-c14a4cb66736"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.525579 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.525626 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.525637 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.525662 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcsvr\" (UniqueName: \"kubernetes.io/projected/050b7c88-3e41-445b-af12-c14a4cb66736-kube-api-access-mcsvr\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:56 crc 
kubenswrapper[4974]: I1013 18:29:56.525671 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/050b7c88-3e41-445b-af12-c14a4cb66736-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.655833 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-65cff4f877-2mckm"] Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.784600 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Oct 13 18:29:56 crc kubenswrapper[4974]: E1013 18:29:56.784930 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050b7c88-3e41-445b-af12-c14a4cb66736" containerName="init" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.784950 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="050b7c88-3e41-445b-af12-c14a4cb66736" containerName="init" Oct 13 18:29:56 crc kubenswrapper[4974]: E1013 18:29:56.784965 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="050b7c88-3e41-445b-af12-c14a4cb66736" containerName="dnsmasq-dns" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.784972 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="050b7c88-3e41-445b-af12-c14a4cb66736" containerName="dnsmasq-dns" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.785166 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="050b7c88-3e41-445b-af12-c14a4cb66736" containerName="dnsmasq-dns" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.787281 4974 generic.go:334] "Generic (PLEG): container finished" podID="050b7c88-3e41-445b-af12-c14a4cb66736" containerID="edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b" exitCode=0 Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.787370 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.790338 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" event={"ID":"050b7c88-3e41-445b-af12-c14a4cb66736","Type":"ContainerDied","Data":"edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b"} Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.790379 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b6dd89997-vdvds" event={"ID":"050b7c88-3e41-445b-af12-c14a4cb66736","Type":"ContainerDied","Data":"1c6c212c3cb0750d4195a956641fcbd58b62959559895f5d7fa40ef9008cd173"} Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.790400 4974 scope.go:117] "RemoveContainer" containerID="edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.790549 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.792473 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" event={"ID":"37628139-a9e5-4396-a102-b1cc1de2fd2e","Type":"ContainerStarted","Data":"f820462a93b0f6aae1c61e759d41d2b18815e8aba4a9940827524d75cc9e7182"} Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.797014 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.797149 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.797314 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-rfh25" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.799706 4974 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"swift-conf" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.807934 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.845430 4974 scope.go:117] "RemoveContainer" containerID="1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.845533 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b6dd89997-vdvds"] Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.853195 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b6dd89997-vdvds"] Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.882994 4974 scope.go:117] "RemoveContainer" containerID="edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b" Oct 13 18:29:56 crc kubenswrapper[4974]: E1013 18:29:56.884704 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b\": container with ID starting with edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b not found: ID does not exist" containerID="edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.884747 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b"} err="failed to get container status \"edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b\": rpc error: code = NotFound desc = could not find container \"edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b\": container with ID starting with edb6b47c9f4dbe8a14ead6f7210c196617cb85597622993cb56f8c31964bbc3b not found: ID does not exist" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.884775 4974 
scope.go:117] "RemoveContainer" containerID="1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8" Oct 13 18:29:56 crc kubenswrapper[4974]: E1013 18:29:56.885091 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8\": container with ID starting with 1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8 not found: ID does not exist" containerID="1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.885114 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8"} err="failed to get container status \"1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8\": rpc error: code = NotFound desc = could not find container \"1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8\": container with ID starting with 1cb2e5f102aa25ace5b955577706ef587ff58eee9643984cfaad52737b9e78e8 not found: ID does not exist" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.931531 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.931626 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d7e32e29-6d51-4230-b7d5-911b0787a900-cache\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.931696 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.931766 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d7e32e29-6d51-4230-b7d5-911b0787a900-lock\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:56 crc kubenswrapper[4974]: I1013 18:29:56.931897 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqchh\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-kube-api-access-gqchh\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.033237 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.033289 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d7e32e29-6d51-4230-b7d5-911b0787a900-lock\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.033366 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqchh\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-kube-api-access-gqchh\") pod \"swift-storage-0\" 
(UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.033392 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.033428 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d7e32e29-6d51-4230-b7d5-911b0787a900-cache\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.033683 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.033902 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/d7e32e29-6d51-4230-b7d5-911b0787a900-cache\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.034014 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/d7e32e29-6d51-4230-b7d5-911b0787a900-lock\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: E1013 18:29:57.034071 4974 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not 
found Oct 13 18:29:57 crc kubenswrapper[4974]: E1013 18:29:57.034084 4974 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 13 18:29:57 crc kubenswrapper[4974]: E1013 18:29:57.034131 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift podName:d7e32e29-6d51-4230-b7d5-911b0787a900 nodeName:}" failed. No retries permitted until 2025-10-13 18:29:57.534116426 +0000 UTC m=+932.438482506 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift") pod "swift-storage-0" (UID: "d7e32e29-6d51-4230-b7d5-911b0787a900") : configmap "swift-ring-files" not found Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.055064 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqchh\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-kube-api-access-gqchh\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.057769 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.541849 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:57 crc kubenswrapper[4974]: E1013 18:29:57.542371 4974 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 13 18:29:57 crc kubenswrapper[4974]: E1013 18:29:57.542409 4974 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 13 18:29:57 crc kubenswrapper[4974]: E1013 18:29:57.542488 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift podName:d7e32e29-6d51-4230-b7d5-911b0787a900 nodeName:}" failed. No retries permitted until 2025-10-13 18:29:58.54247266 +0000 UTC m=+933.446838740 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift") pod "swift-storage-0" (UID: "d7e32e29-6d51-4230-b7d5-911b0787a900") : configmap "swift-ring-files" not found Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.803598 4974 generic.go:334] "Generic (PLEG): container finished" podID="37628139-a9e5-4396-a102-b1cc1de2fd2e" containerID="cdbdf537d5f0b6dc689516cf8aa79a990c66b7a5f1ba685b8b169f31c269fcc3" exitCode=0 Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.803661 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" event={"ID":"37628139-a9e5-4396-a102-b1cc1de2fd2e","Type":"ContainerDied","Data":"cdbdf537d5f0b6dc689516cf8aa79a990c66b7a5f1ba685b8b169f31c269fcc3"} Oct 13 18:29:57 crc kubenswrapper[4974]: I1013 18:29:57.846753 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="050b7c88-3e41-445b-af12-c14a4cb66736" path="/var/lib/kubelet/pods/050b7c88-3e41-445b-af12-c14a4cb66736/volumes" Oct 13 18:29:58 crc kubenswrapper[4974]: I1013 18:29:58.567931 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift\") pod 
\"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:29:58 crc kubenswrapper[4974]: E1013 18:29:58.568807 4974 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 13 18:29:58 crc kubenswrapper[4974]: E1013 18:29:58.568832 4974 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 13 18:29:58 crc kubenswrapper[4974]: E1013 18:29:58.568892 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift podName:d7e32e29-6d51-4230-b7d5-911b0787a900 nodeName:}" failed. No retries permitted until 2025-10-13 18:30:00.568872373 +0000 UTC m=+935.473238453 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift") pod "swift-storage-0" (UID: "d7e32e29-6d51-4230-b7d5-911b0787a900") : configmap "swift-ring-files" not found Oct 13 18:29:58 crc kubenswrapper[4974]: I1013 18:29:58.815058 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" event={"ID":"37628139-a9e5-4396-a102-b1cc1de2fd2e","Type":"ContainerStarted","Data":"13f558a91965142fefcd5b7de36b3049dbe860968e2e34d0ef1395f653ac60dc"} Oct 13 18:29:58 crc kubenswrapper[4974]: I1013 18:29:58.815957 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:29:58 crc kubenswrapper[4974]: I1013 18:29:58.838392 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" podStartSLOduration=3.838373571 podStartE2EDuration="3.838373571s" podCreationTimestamp="2025-10-13 18:29:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2025-10-13 18:29:58.834093211 +0000 UTC m=+933.738459311" watchObservedRunningTime="2025-10-13 18:29:58.838373571 +0000 UTC m=+933.742739651" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.137804 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll"] Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.140431 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.142614 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.142877 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.149886 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll"] Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.294507 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43954b5e-c77a-4ef7-bc58-42fc4f98600b-secret-volume\") pod \"collect-profiles-29339670-f6vll\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.294557 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43954b5e-c77a-4ef7-bc58-42fc4f98600b-config-volume\") pod \"collect-profiles-29339670-f6vll\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.294724 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb8fp\" (UniqueName: \"kubernetes.io/projected/43954b5e-c77a-4ef7-bc58-42fc4f98600b-kube-api-access-fb8fp\") pod \"collect-profiles-29339670-f6vll\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.396211 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43954b5e-c77a-4ef7-bc58-42fc4f98600b-secret-volume\") pod \"collect-profiles-29339670-f6vll\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.396269 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43954b5e-c77a-4ef7-bc58-42fc4f98600b-config-volume\") pod \"collect-profiles-29339670-f6vll\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.396340 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fb8fp\" (UniqueName: \"kubernetes.io/projected/43954b5e-c77a-4ef7-bc58-42fc4f98600b-kube-api-access-fb8fp\") pod \"collect-profiles-29339670-f6vll\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.398589 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/43954b5e-c77a-4ef7-bc58-42fc4f98600b-config-volume\") pod \"collect-profiles-29339670-f6vll\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.403348 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43954b5e-c77a-4ef7-bc58-42fc4f98600b-secret-volume\") pod \"collect-profiles-29339670-f6vll\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.412086 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb8fp\" (UniqueName: \"kubernetes.io/projected/43954b5e-c77a-4ef7-bc58-42fc4f98600b-kube-api-access-fb8fp\") pod \"collect-profiles-29339670-f6vll\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.478070 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.599060 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:30:00 crc kubenswrapper[4974]: E1013 18:30:00.599276 4974 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 13 18:30:00 crc kubenswrapper[4974]: E1013 18:30:00.599297 4974 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 13 18:30:00 crc kubenswrapper[4974]: E1013 18:30:00.599362 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift podName:d7e32e29-6d51-4230-b7d5-911b0787a900 nodeName:}" failed. No retries permitted until 2025-10-13 18:30:04.599336997 +0000 UTC m=+939.503703087 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift") pod "swift-storage-0" (UID: "d7e32e29-6d51-4230-b7d5-911b0787a900") : configmap "swift-ring-files" not found Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.789614 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-x86hr"] Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.798170 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.804379 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.804578 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.804706 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.827151 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-x86hr"] Oct 13 18:30:00 crc kubenswrapper[4974]: E1013 18:30:00.854437 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-ld6v5 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-ld6v5 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-x86hr" podUID="fe39c7a6-6f93-4b5b-8fba-046b737a249a" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.860006 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-x86hr"] Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.871043 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9981a58f-bf58-44c5-9da9-fbe0ef9005a9","Type":"ContainerStarted","Data":"b0aacf843ad242659bec4fa2c523ef04dbdaa3aab4e7148b88e9cfff2b3dfd45"} Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.871078 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-7phmq"] Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.872097 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.899107 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7phmq"] Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.903743 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-scripts\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.903797 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-swiftconf\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.903819 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-combined-ca-bundle\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.903837 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-ring-data-devices\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.903905 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-dispersionconf\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.903943 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ld6v5\" (UniqueName: \"kubernetes.io/projected/fe39c7a6-6f93-4b5b-8fba-046b737a249a-kube-api-access-ld6v5\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:00 crc kubenswrapper[4974]: I1013 18:30:00.903987 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fe39c7a6-6f93-4b5b-8fba-046b737a249a-etc-swift\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.005417 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ld6v5\" (UniqueName: \"kubernetes.io/projected/fe39c7a6-6f93-4b5b-8fba-046b737a249a-kube-api-access-ld6v5\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.005747 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fe39c7a6-6f93-4b5b-8fba-046b737a249a-etc-swift\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.005888 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-combined-ca-bundle\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006083 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-swiftconf\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006222 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6aff3b1c-df57-4faf-9c6b-1009d5090a13-etc-swift\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006296 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-scripts\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006351 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-swiftconf\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006430 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpd6m\" (UniqueName: 
\"kubernetes.io/projected/6aff3b1c-df57-4faf-9c6b-1009d5090a13-kube-api-access-wpd6m\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006517 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-combined-ca-bundle\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006559 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-ring-data-devices\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006639 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-dispersionconf\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006721 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-ring-data-devices\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006779 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-scripts\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006872 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-dispersionconf\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.006946 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fe39c7a6-6f93-4b5b-8fba-046b737a249a-etc-swift\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.008111 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-scripts\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.009921 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-ring-data-devices\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.012831 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-swiftconf\") pod \"swift-ring-rebalance-x86hr\" (UID: 
\"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.013128 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-combined-ca-bundle\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.015106 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-dispersionconf\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.030219 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ld6v5\" (UniqueName: \"kubernetes.io/projected/fe39c7a6-6f93-4b5b-8fba-046b737a249a-kube-api-access-ld6v5\") pod \"swift-ring-rebalance-x86hr\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.068462 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll"] Oct 13 18:30:01 crc kubenswrapper[4974]: W1013 18:30:01.074972 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43954b5e_c77a_4ef7_bc58_42fc4f98600b.slice/crio-c066ff642ccdcc289406ba7c8bfa3df14de61aaab7b6b7c3edf1b270c9b7c484 WatchSource:0}: Error finding container c066ff642ccdcc289406ba7c8bfa3df14de61aaab7b6b7c3edf1b270c9b7c484: Status 404 returned error can't find the container with id c066ff642ccdcc289406ba7c8bfa3df14de61aaab7b6b7c3edf1b270c9b7c484 Oct 13 18:30:01 crc 
kubenswrapper[4974]: I1013 18:30:01.107935 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpd6m\" (UniqueName: \"kubernetes.io/projected/6aff3b1c-df57-4faf-9c6b-1009d5090a13-kube-api-access-wpd6m\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.107990 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-dispersionconf\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.108011 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-ring-data-devices\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.108035 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-scripts\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.108110 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-combined-ca-bundle\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.108140 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-swiftconf\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.108177 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6aff3b1c-df57-4faf-9c6b-1009d5090a13-etc-swift\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.108687 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6aff3b1c-df57-4faf-9c6b-1009d5090a13-etc-swift\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.109347 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-ring-data-devices\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.110107 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-scripts\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.112770 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-dispersionconf\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.113159 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-swiftconf\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.113614 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-combined-ca-bundle\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.131596 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpd6m\" (UniqueName: \"kubernetes.io/projected/6aff3b1c-df57-4faf-9c6b-1009d5090a13-kube-api-access-wpd6m\") pod \"swift-ring-rebalance-7phmq\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.206973 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.638431 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-7phmq"] Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.880005 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7phmq" event={"ID":"6aff3b1c-df57-4faf-9c6b-1009d5090a13","Type":"ContainerStarted","Data":"4aee309085826944e62400e6d521afbdc9102a6b0482704cb99c7d6546efd5eb"} Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.882573 4974 generic.go:334] "Generic (PLEG): container finished" podID="43954b5e-c77a-4ef7-bc58-42fc4f98600b" containerID="0e6dfd0bcd19d62a6ae1cced2370558f6aa13aedb14c46a80038429e82cca7e3" exitCode=0 Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.882710 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.883090 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" event={"ID":"43954b5e-c77a-4ef7-bc58-42fc4f98600b","Type":"ContainerDied","Data":"0e6dfd0bcd19d62a6ae1cced2370558f6aa13aedb14c46a80038429e82cca7e3"} Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.883156 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" event={"ID":"43954b5e-c77a-4ef7-bc58-42fc4f98600b","Type":"ContainerStarted","Data":"c066ff642ccdcc289406ba7c8bfa3df14de61aaab7b6b7c3edf1b270c9b7c484"} Oct 13 18:30:01 crc kubenswrapper[4974]: I1013 18:30:01.925547 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.025523 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-ring-data-devices\") pod \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.025583 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-scripts\") pod \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.025729 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-combined-ca-bundle\") pod \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.025785 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fe39c7a6-6f93-4b5b-8fba-046b737a249a-etc-swift\") pod \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.025849 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-swiftconf\") pod \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.025875 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-dispersionconf\") pod \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.025933 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ld6v5\" (UniqueName: \"kubernetes.io/projected/fe39c7a6-6f93-4b5b-8fba-046b737a249a-kube-api-access-ld6v5\") pod \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\" (UID: \"fe39c7a6-6f93-4b5b-8fba-046b737a249a\") " Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.026107 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fe39c7a6-6f93-4b5b-8fba-046b737a249a" (UID: "fe39c7a6-6f93-4b5b-8fba-046b737a249a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.026338 4974 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.026401 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fe39c7a6-6f93-4b5b-8fba-046b737a249a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fe39c7a6-6f93-4b5b-8fba-046b737a249a" (UID: "fe39c7a6-6f93-4b5b-8fba-046b737a249a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.026461 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-scripts" (OuterVolumeSpecName: "scripts") pod "fe39c7a6-6f93-4b5b-8fba-046b737a249a" (UID: "fe39c7a6-6f93-4b5b-8fba-046b737a249a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.030829 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fe39c7a6-6f93-4b5b-8fba-046b737a249a" (UID: "fe39c7a6-6f93-4b5b-8fba-046b737a249a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.030892 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fe39c7a6-6f93-4b5b-8fba-046b737a249a" (UID: "fe39c7a6-6f93-4b5b-8fba-046b737a249a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.031472 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fe39c7a6-6f93-4b5b-8fba-046b737a249a" (UID: "fe39c7a6-6f93-4b5b-8fba-046b737a249a"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.036949 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe39c7a6-6f93-4b5b-8fba-046b737a249a-kube-api-access-ld6v5" (OuterVolumeSpecName: "kube-api-access-ld6v5") pod "fe39c7a6-6f93-4b5b-8fba-046b737a249a" (UID: "fe39c7a6-6f93-4b5b-8fba-046b737a249a"). InnerVolumeSpecName "kube-api-access-ld6v5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.127720 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.127752 4974 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fe39c7a6-6f93-4b5b-8fba-046b737a249a-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.127761 4974 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-swiftconf\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.127769 4974 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fe39c7a6-6f93-4b5b-8fba-046b737a249a-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.127780 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ld6v5\" (UniqueName: \"kubernetes.io/projected/fe39c7a6-6f93-4b5b-8fba-046b737a249a-kube-api-access-ld6v5\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.127822 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/fe39c7a6-6f93-4b5b-8fba-046b737a249a-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.890754 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-x86hr" Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.946970 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-x86hr"] Oct 13 18:30:02 crc kubenswrapper[4974]: I1013 18:30:02.956127 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-x86hr"] Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.435866 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-mb5k8"] Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.438075 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mb5k8" Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.445523 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mb5k8"] Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.551730 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnq5h\" (UniqueName: \"kubernetes.io/projected/c4ff600a-6c11-4019-93a5-efe02ce706ab-kube-api-access-xnq5h\") pod \"keystone-db-create-mb5k8\" (UID: \"c4ff600a-6c11-4019-93a5-efe02ce706ab\") " pod="openstack/keystone-db-create-mb5k8" Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.654046 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnq5h\" (UniqueName: \"kubernetes.io/projected/c4ff600a-6c11-4019-93a5-efe02ce706ab-kube-api-access-xnq5h\") pod \"keystone-db-create-mb5k8\" (UID: \"c4ff600a-6c11-4019-93a5-efe02ce706ab\") " pod="openstack/keystone-db-create-mb5k8" Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.676992 4974 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jmszb"] Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.678034 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jmszb" Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.695409 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnq5h\" (UniqueName: \"kubernetes.io/projected/c4ff600a-6c11-4019-93a5-efe02ce706ab-kube-api-access-xnq5h\") pod \"keystone-db-create-mb5k8\" (UID: \"c4ff600a-6c11-4019-93a5-efe02ce706ab\") " pod="openstack/keystone-db-create-mb5k8" Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.702421 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jmszb"] Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.756472 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jbjp\" (UniqueName: \"kubernetes.io/projected/feef59e0-044c-4c9b-b496-44c30b067129-kube-api-access-7jbjp\") pod \"placement-db-create-jmszb\" (UID: \"feef59e0-044c-4c9b-b496-44c30b067129\") " pod="openstack/placement-db-create-jmszb" Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.766872 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mb5k8" Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.837241 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe39c7a6-6f93-4b5b-8fba-046b737a249a" path="/var/lib/kubelet/pods/fe39c7a6-6f93-4b5b-8fba-046b737a249a/volumes" Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.858353 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jbjp\" (UniqueName: \"kubernetes.io/projected/feef59e0-044c-4c9b-b496-44c30b067129-kube-api-access-7jbjp\") pod \"placement-db-create-jmszb\" (UID: \"feef59e0-044c-4c9b-b496-44c30b067129\") " pod="openstack/placement-db-create-jmszb" Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.893696 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jbjp\" (UniqueName: \"kubernetes.io/projected/feef59e0-044c-4c9b-b496-44c30b067129-kube-api-access-7jbjp\") pod \"placement-db-create-jmszb\" (UID: \"feef59e0-044c-4c9b-b496-44c30b067129\") " pod="openstack/placement-db-create-jmszb" Oct 13 18:30:03 crc kubenswrapper[4974]: I1013 18:30:03.903113 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9981a58f-bf58-44c5-9da9-fbe0ef9005a9","Type":"ContainerStarted","Data":"30e5efccd116a18e824bf0a06f3202193f9a04cc1e702e3c59c8e64391f9d065"} Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.042110 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jmszb" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.179152 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.265860 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43954b5e-c77a-4ef7-bc58-42fc4f98600b-config-volume\") pod \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.265957 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43954b5e-c77a-4ef7-bc58-42fc4f98600b-secret-volume\") pod \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.265988 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fb8fp\" (UniqueName: \"kubernetes.io/projected/43954b5e-c77a-4ef7-bc58-42fc4f98600b-kube-api-access-fb8fp\") pod \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\" (UID: \"43954b5e-c77a-4ef7-bc58-42fc4f98600b\") " Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.266745 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43954b5e-c77a-4ef7-bc58-42fc4f98600b-config-volume" (OuterVolumeSpecName: "config-volume") pod "43954b5e-c77a-4ef7-bc58-42fc4f98600b" (UID: "43954b5e-c77a-4ef7-bc58-42fc4f98600b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.272032 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43954b5e-c77a-4ef7-bc58-42fc4f98600b-kube-api-access-fb8fp" (OuterVolumeSpecName: "kube-api-access-fb8fp") pod "43954b5e-c77a-4ef7-bc58-42fc4f98600b" (UID: "43954b5e-c77a-4ef7-bc58-42fc4f98600b"). 
InnerVolumeSpecName "kube-api-access-fb8fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.272051 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43954b5e-c77a-4ef7-bc58-42fc4f98600b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "43954b5e-c77a-4ef7-bc58-42fc4f98600b" (UID: "43954b5e-c77a-4ef7-bc58-42fc4f98600b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.368302 4974 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/43954b5e-c77a-4ef7-bc58-42fc4f98600b-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.368569 4974 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/43954b5e-c77a-4ef7-bc58-42fc4f98600b-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.368579 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fb8fp\" (UniqueName: \"kubernetes.io/projected/43954b5e-c77a-4ef7-bc58-42fc4f98600b-kube-api-access-fb8fp\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.528212 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-mb5k8"] Oct 13 18:30:04 crc kubenswrapper[4974]: W1013 18:30:04.530283 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4ff600a_6c11_4019_93a5_efe02ce706ab.slice/crio-8c0b90818ad3ec0eea80c985d550f61d182f38fef8393eb7a8c54871e4eb55a2 WatchSource:0}: Error finding container 8c0b90818ad3ec0eea80c985d550f61d182f38fef8393eb7a8c54871e4eb55a2: Status 404 returned error can't find the container with id 
8c0b90818ad3ec0eea80c985d550f61d182f38fef8393eb7a8c54871e4eb55a2 Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.594988 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jmszb"] Oct 13 18:30:04 crc kubenswrapper[4974]: W1013 18:30:04.601054 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeef59e0_044c_4c9b_b496_44c30b067129.slice/crio-7673ba5a7e4626b518742b7e3c8b6d606eabb67b0375a62f90f53fedbb741b49 WatchSource:0}: Error finding container 7673ba5a7e4626b518742b7e3c8b6d606eabb67b0375a62f90f53fedbb741b49: Status 404 returned error can't find the container with id 7673ba5a7e4626b518742b7e3c8b6d606eabb67b0375a62f90f53fedbb741b49 Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.674830 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:30:04 crc kubenswrapper[4974]: E1013 18:30:04.675409 4974 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Oct 13 18:30:04 crc kubenswrapper[4974]: E1013 18:30:04.675449 4974 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Oct 13 18:30:04 crc kubenswrapper[4974]: E1013 18:30:04.675524 4974 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift podName:d7e32e29-6d51-4230-b7d5-911b0787a900 nodeName:}" failed. No retries permitted until 2025-10-13 18:30:12.675498517 +0000 UTC m=+947.579864607 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift") pod "swift-storage-0" (UID: "d7e32e29-6d51-4230-b7d5-911b0787a900") : configmap "swift-ring-files" not found Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.723732 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.914580 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jmszb" event={"ID":"feef59e0-044c-4c9b-b496-44c30b067129","Type":"ContainerStarted","Data":"1a36eeda5829317cf41e2a570a027aa0cd0a3ed05abb0a975bead2c2ac42018e"} Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.914978 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jmszb" event={"ID":"feef59e0-044c-4c9b-b496-44c30b067129","Type":"ContainerStarted","Data":"7673ba5a7e4626b518742b7e3c8b6d606eabb67b0375a62f90f53fedbb741b49"} Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.918260 4974 generic.go:334] "Generic (PLEG): container finished" podID="c4ff600a-6c11-4019-93a5-efe02ce706ab" containerID="48c02468a7a7d14686cf919f3fcb7c891bd4bf746c5701ec1d940a02a1527db1" exitCode=0 Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.918324 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mb5k8" event={"ID":"c4ff600a-6c11-4019-93a5-efe02ce706ab","Type":"ContainerDied","Data":"48c02468a7a7d14686cf919f3fcb7c891bd4bf746c5701ec1d940a02a1527db1"} Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.918346 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mb5k8" event={"ID":"c4ff600a-6c11-4019-93a5-efe02ce706ab","Type":"ContainerStarted","Data":"8c0b90818ad3ec0eea80c985d550f61d182f38fef8393eb7a8c54871e4eb55a2"} Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.922203 4974 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7phmq" event={"ID":"6aff3b1c-df57-4faf-9c6b-1009d5090a13","Type":"ContainerStarted","Data":"7592af8d1ad393b28f87fcf6551d9bbf419b7d6fe1b462c422d58890b59ea53b"} Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.924065 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" event={"ID":"43954b5e-c77a-4ef7-bc58-42fc4f98600b","Type":"ContainerDied","Data":"c066ff642ccdcc289406ba7c8bfa3df14de61aaab7b6b7c3edf1b270c9b7c484"} Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.924093 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c066ff642ccdcc289406ba7c8bfa3df14de61aaab7b6b7c3edf1b270c9b7c484" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.924143 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.942329 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-jmszb" podStartSLOduration=1.94230948 podStartE2EDuration="1.94230948s" podCreationTimestamp="2025-10-13 18:30:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:30:04.93519616 +0000 UTC m=+939.839562260" watchObservedRunningTime="2025-10-13 18:30:04.94230948 +0000 UTC m=+939.846675570" Oct 13 18:30:04 crc kubenswrapper[4974]: I1013 18:30:04.973314 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-7phmq" podStartSLOduration=2.535541864 podStartE2EDuration="4.973297764s" podCreationTimestamp="2025-10-13 18:30:00 +0000 UTC" firstStartedPulling="2025-10-13 18:30:01.646920267 +0000 UTC m=+936.551286377" lastFinishedPulling="2025-10-13 
18:30:04.084676177 +0000 UTC m=+938.989042277" observedRunningTime="2025-10-13 18:30:04.966618256 +0000 UTC m=+939.870984356" watchObservedRunningTime="2025-10-13 18:30:04.973297764 +0000 UTC m=+939.877663844" Oct 13 18:30:05 crc kubenswrapper[4974]: I1013 18:30:05.514878 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-5z6xn"] Oct 13 18:30:05 crc kubenswrapper[4974]: E1013 18:30:05.515318 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43954b5e-c77a-4ef7-bc58-42fc4f98600b" containerName="collect-profiles" Oct 13 18:30:05 crc kubenswrapper[4974]: I1013 18:30:05.515342 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="43954b5e-c77a-4ef7-bc58-42fc4f98600b" containerName="collect-profiles" Oct 13 18:30:05 crc kubenswrapper[4974]: I1013 18:30:05.515584 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="43954b5e-c77a-4ef7-bc58-42fc4f98600b" containerName="collect-profiles" Oct 13 18:30:05 crc kubenswrapper[4974]: I1013 18:30:05.516297 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-5z6xn" Oct 13 18:30:05 crc kubenswrapper[4974]: I1013 18:30:05.526291 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-5z6xn"] Oct 13 18:30:05 crc kubenswrapper[4974]: I1013 18:30:05.591292 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvzgp\" (UniqueName: \"kubernetes.io/projected/534daef5-5335-4b53-ba13-980e53570cd7-kube-api-access-hvzgp\") pod \"watcher-db-create-5z6xn\" (UID: \"534daef5-5335-4b53-ba13-980e53570cd7\") " pod="openstack/watcher-db-create-5z6xn" Oct 13 18:30:05 crc kubenswrapper[4974]: I1013 18:30:05.693948 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvzgp\" (UniqueName: \"kubernetes.io/projected/534daef5-5335-4b53-ba13-980e53570cd7-kube-api-access-hvzgp\") pod \"watcher-db-create-5z6xn\" (UID: \"534daef5-5335-4b53-ba13-980e53570cd7\") " pod="openstack/watcher-db-create-5z6xn" Oct 13 18:30:05 crc kubenswrapper[4974]: I1013 18:30:05.723379 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvzgp\" (UniqueName: \"kubernetes.io/projected/534daef5-5335-4b53-ba13-980e53570cd7-kube-api-access-hvzgp\") pod \"watcher-db-create-5z6xn\" (UID: \"534daef5-5335-4b53-ba13-980e53570cd7\") " pod="openstack/watcher-db-create-5z6xn" Oct 13 18:30:05 crc kubenswrapper[4974]: I1013 18:30:05.837781 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-5z6xn" Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.009250 4974 generic.go:334] "Generic (PLEG): container finished" podID="feef59e0-044c-4c9b-b496-44c30b067129" containerID="1a36eeda5829317cf41e2a570a027aa0cd0a3ed05abb0a975bead2c2ac42018e" exitCode=0 Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.009730 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jmszb" event={"ID":"feef59e0-044c-4c9b-b496-44c30b067129","Type":"ContainerDied","Data":"1a36eeda5829317cf41e2a570a027aa0cd0a3ed05abb0a975bead2c2ac42018e"} Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.150850 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.242925 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9c66c79-svxxk"] Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.243238 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" podUID="6887cfd9-80d7-43bc-bcb6-01b264f72d5a" containerName="dnsmasq-dns" containerID="cri-o://0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d" gracePeriod=10 Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.343445 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-5z6xn"] Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.443574 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-mb5k8" Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.521994 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnq5h\" (UniqueName: \"kubernetes.io/projected/c4ff600a-6c11-4019-93a5-efe02ce706ab-kube-api-access-xnq5h\") pod \"c4ff600a-6c11-4019-93a5-efe02ce706ab\" (UID: \"c4ff600a-6c11-4019-93a5-efe02ce706ab\") " Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.529493 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4ff600a-6c11-4019-93a5-efe02ce706ab-kube-api-access-xnq5h" (OuterVolumeSpecName: "kube-api-access-xnq5h") pod "c4ff600a-6c11-4019-93a5-efe02ce706ab" (UID: "c4ff600a-6c11-4019-93a5-efe02ce706ab"). InnerVolumeSpecName "kube-api-access-xnq5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.622931 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnq5h\" (UniqueName: \"kubernetes.io/projected/c4ff600a-6c11-4019-93a5-efe02ce706ab-kube-api-access-xnq5h\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.771407 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.939078 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqjrb\" (UniqueName: \"kubernetes.io/projected/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-kube-api-access-rqjrb\") pod \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.939187 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-config\") pod \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.939285 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-dns-svc\") pod \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\" (UID: \"6887cfd9-80d7-43bc-bcb6-01b264f72d5a\") " Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.952870 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-kube-api-access-rqjrb" (OuterVolumeSpecName: "kube-api-access-rqjrb") pod "6887cfd9-80d7-43bc-bcb6-01b264f72d5a" (UID: "6887cfd9-80d7-43bc-bcb6-01b264f72d5a"). InnerVolumeSpecName "kube-api-access-rqjrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:06 crc kubenswrapper[4974]: I1013 18:30:06.979555 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-config" (OuterVolumeSpecName: "config") pod "6887cfd9-80d7-43bc-bcb6-01b264f72d5a" (UID: "6887cfd9-80d7-43bc-bcb6-01b264f72d5a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.006621 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6887cfd9-80d7-43bc-bcb6-01b264f72d5a" (UID: "6887cfd9-80d7-43bc-bcb6-01b264f72d5a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.020684 4974 generic.go:334] "Generic (PLEG): container finished" podID="6887cfd9-80d7-43bc-bcb6-01b264f72d5a" containerID="0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d" exitCode=0 Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.020816 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.022073 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" event={"ID":"6887cfd9-80d7-43bc-bcb6-01b264f72d5a","Type":"ContainerDied","Data":"0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d"} Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.022156 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79f9c66c79-svxxk" event={"ID":"6887cfd9-80d7-43bc-bcb6-01b264f72d5a","Type":"ContainerDied","Data":"4697d6348cc2253566f6dca8031f337d66078cdd9f7d8aad43a52da1d52576cf"} Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.022179 4974 scope.go:117] "RemoveContainer" containerID="0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.026721 4974 generic.go:334] "Generic (PLEG): container finished" podID="534daef5-5335-4b53-ba13-980e53570cd7" containerID="bb3167f020d19fea7d8b23c25b06afa20cc989e85a9cfa01936ee994b7a2a677" exitCode=0 Oct 13 18:30:07 
crc kubenswrapper[4974]: I1013 18:30:07.026773 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-5z6xn" event={"ID":"534daef5-5335-4b53-ba13-980e53570cd7","Type":"ContainerDied","Data":"bb3167f020d19fea7d8b23c25b06afa20cc989e85a9cfa01936ee994b7a2a677"} Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.026797 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-5z6xn" event={"ID":"534daef5-5335-4b53-ba13-980e53570cd7","Type":"ContainerStarted","Data":"31ad6f9f934046246aa78b03862989221e0477b29fcc9f2d44aa1f14ed0d9cb7"} Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.030123 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-mb5k8" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.030247 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-mb5k8" event={"ID":"c4ff600a-6c11-4019-93a5-efe02ce706ab","Type":"ContainerDied","Data":"8c0b90818ad3ec0eea80c985d550f61d182f38fef8393eb7a8c54871e4eb55a2"} Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.030265 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c0b90818ad3ec0eea80c985d550f61d182f38fef8393eb7a8c54871e4eb55a2" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.045127 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.045186 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqjrb\" (UniqueName: \"kubernetes.io/projected/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-kube-api-access-rqjrb\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.045214 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6887cfd9-80d7-43bc-bcb6-01b264f72d5a-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.074166 4974 scope.go:117] "RemoveContainer" containerID="975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.080485 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79f9c66c79-svxxk"] Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.089885 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79f9c66c79-svxxk"] Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.107226 4974 scope.go:117] "RemoveContainer" containerID="0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d" Oct 13 18:30:07 crc kubenswrapper[4974]: E1013 18:30:07.107586 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d\": container with ID starting with 0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d not found: ID does not exist" containerID="0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.107624 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d"} err="failed to get container status \"0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d\": rpc error: code = NotFound desc = could not find container \"0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d\": container with ID starting with 0ed47baab06cefae1bb84c6f7faea9c8dec81ad35fdfcc2e1535babf4cf8206d not found: ID does not exist" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.107988 4974 scope.go:117] "RemoveContainer" 
containerID="975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04" Oct 13 18:30:07 crc kubenswrapper[4974]: E1013 18:30:07.108301 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04\": container with ID starting with 975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04 not found: ID does not exist" containerID="975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.108342 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04"} err="failed to get container status \"975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04\": rpc error: code = NotFound desc = could not find container \"975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04\": container with ID starting with 975a605a596af28fdca8e13741a0c5c34b69945fdd304b6a7a5063aa3d8efe04 not found: ID does not exist" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.324613 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jmszb" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.451514 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jbjp\" (UniqueName: \"kubernetes.io/projected/feef59e0-044c-4c9b-b496-44c30b067129-kube-api-access-7jbjp\") pod \"feef59e0-044c-4c9b-b496-44c30b067129\" (UID: \"feef59e0-044c-4c9b-b496-44c30b067129\") " Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.454872 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/feef59e0-044c-4c9b-b496-44c30b067129-kube-api-access-7jbjp" (OuterVolumeSpecName: "kube-api-access-7jbjp") pod "feef59e0-044c-4c9b-b496-44c30b067129" (UID: "feef59e0-044c-4c9b-b496-44c30b067129"). InnerVolumeSpecName "kube-api-access-7jbjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.553638 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7jbjp\" (UniqueName: \"kubernetes.io/projected/feef59e0-044c-4c9b-b496-44c30b067129-kube-api-access-7jbjp\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:07 crc kubenswrapper[4974]: I1013 18:30:07.833145 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6887cfd9-80d7-43bc-bcb6-01b264f72d5a" path="/var/lib/kubelet/pods/6887cfd9-80d7-43bc-bcb6-01b264f72d5a/volumes" Oct 13 18:30:08 crc kubenswrapper[4974]: I1013 18:30:08.043957 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jmszb" Oct 13 18:30:08 crc kubenswrapper[4974]: I1013 18:30:08.043958 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jmszb" event={"ID":"feef59e0-044c-4c9b-b496-44c30b067129","Type":"ContainerDied","Data":"7673ba5a7e4626b518742b7e3c8b6d606eabb67b0375a62f90f53fedbb741b49"} Oct 13 18:30:08 crc kubenswrapper[4974]: I1013 18:30:08.044015 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7673ba5a7e4626b518742b7e3c8b6d606eabb67b0375a62f90f53fedbb741b49" Oct 13 18:30:09 crc kubenswrapper[4974]: I1013 18:30:09.385037 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-5z6xn" Oct 13 18:30:09 crc kubenswrapper[4974]: I1013 18:30:09.490822 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvzgp\" (UniqueName: \"kubernetes.io/projected/534daef5-5335-4b53-ba13-980e53570cd7-kube-api-access-hvzgp\") pod \"534daef5-5335-4b53-ba13-980e53570cd7\" (UID: \"534daef5-5335-4b53-ba13-980e53570cd7\") " Oct 13 18:30:09 crc kubenswrapper[4974]: I1013 18:30:09.502726 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534daef5-5335-4b53-ba13-980e53570cd7-kube-api-access-hvzgp" (OuterVolumeSpecName: "kube-api-access-hvzgp") pod "534daef5-5335-4b53-ba13-980e53570cd7" (UID: "534daef5-5335-4b53-ba13-980e53570cd7"). InnerVolumeSpecName "kube-api-access-hvzgp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:09 crc kubenswrapper[4974]: I1013 18:30:09.592439 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvzgp\" (UniqueName: \"kubernetes.io/projected/534daef5-5335-4b53-ba13-980e53570cd7-kube-api-access-hvzgp\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:10 crc kubenswrapper[4974]: I1013 18:30:10.066746 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9981a58f-bf58-44c5-9da9-fbe0ef9005a9","Type":"ContainerStarted","Data":"3648f94d39bea1fd74a5338d31e824f4a7c317426ef04dd94384c7aeb6fe4431"} Oct 13 18:30:10 crc kubenswrapper[4974]: I1013 18:30:10.069210 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-5z6xn" event={"ID":"534daef5-5335-4b53-ba13-980e53570cd7","Type":"ContainerDied","Data":"31ad6f9f934046246aa78b03862989221e0477b29fcc9f2d44aa1f14ed0d9cb7"} Oct 13 18:30:10 crc kubenswrapper[4974]: I1013 18:30:10.069279 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31ad6f9f934046246aa78b03862989221e0477b29fcc9f2d44aa1f14ed0d9cb7" Oct 13 18:30:10 crc kubenswrapper[4974]: I1013 18:30:10.069239 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-create-5z6xn" Oct 13 18:30:10 crc kubenswrapper[4974]: I1013 18:30:10.486263 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=8.565093702 podStartE2EDuration="45.486239267s" podCreationTimestamp="2025-10-13 18:29:25 +0000 UTC" firstStartedPulling="2025-10-13 18:29:32.560478065 +0000 UTC m=+907.464844145" lastFinishedPulling="2025-10-13 18:30:09.48162363 +0000 UTC m=+944.385989710" observedRunningTime="2025-10-13 18:30:10.113319031 +0000 UTC m=+945.017685141" watchObservedRunningTime="2025-10-13 18:30:10.486239267 +0000 UTC m=+945.390605347" Oct 13 18:30:11 crc kubenswrapper[4974]: I1013 18:30:11.867571 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:11 crc kubenswrapper[4974]: I1013 18:30:11.870481 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:11 crc kubenswrapper[4974]: I1013 18:30:11.879951 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:12 crc kubenswrapper[4974]: I1013 18:30:12.092897 4974 generic.go:334] "Generic (PLEG): container finished" podID="6aff3b1c-df57-4faf-9c6b-1009d5090a13" containerID="7592af8d1ad393b28f87fcf6551d9bbf419b7d6fe1b462c422d58890b59ea53b" exitCode=0 Oct 13 18:30:12 crc kubenswrapper[4974]: I1013 18:30:12.093013 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7phmq" event={"ID":"6aff3b1c-df57-4faf-9c6b-1009d5090a13","Type":"ContainerDied","Data":"7592af8d1ad393b28f87fcf6551d9bbf419b7d6fe1b462c422d58890b59ea53b"} Oct 13 18:30:12 crc kubenswrapper[4974]: I1013 18:30:12.095095 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:12 crc 
kubenswrapper[4974]: I1013 18:30:12.753411 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:30:12 crc kubenswrapper[4974]: I1013 18:30:12.770745 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d7e32e29-6d51-4230-b7d5-911b0787a900-etc-swift\") pod \"swift-storage-0\" (UID: \"d7e32e29-6d51-4230-b7d5-911b0787a900\") " pod="openstack/swift-storage-0" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.012721 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.495491 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a9aa-account-create-zkllw"] Oct 13 18:30:13 crc kubenswrapper[4974]: E1013 18:30:13.496198 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="feef59e0-044c-4c9b-b496-44c30b067129" containerName="mariadb-database-create" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.496216 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="feef59e0-044c-4c9b-b496-44c30b067129" containerName="mariadb-database-create" Oct 13 18:30:13 crc kubenswrapper[4974]: E1013 18:30:13.496235 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4ff600a-6c11-4019-93a5-efe02ce706ab" containerName="mariadb-database-create" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.496244 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4ff600a-6c11-4019-93a5-efe02ce706ab" containerName="mariadb-database-create" Oct 13 18:30:13 crc kubenswrapper[4974]: E1013 18:30:13.496260 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6887cfd9-80d7-43bc-bcb6-01b264f72d5a" containerName="init" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.496268 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6887cfd9-80d7-43bc-bcb6-01b264f72d5a" containerName="init" Oct 13 18:30:13 crc kubenswrapper[4974]: E1013 18:30:13.496300 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534daef5-5335-4b53-ba13-980e53570cd7" containerName="mariadb-database-create" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.496309 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="534daef5-5335-4b53-ba13-980e53570cd7" containerName="mariadb-database-create" Oct 13 18:30:13 crc kubenswrapper[4974]: E1013 18:30:13.496324 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6887cfd9-80d7-43bc-bcb6-01b264f72d5a" containerName="dnsmasq-dns" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.496333 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6887cfd9-80d7-43bc-bcb6-01b264f72d5a" containerName="dnsmasq-dns" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.496526 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4ff600a-6c11-4019-93a5-efe02ce706ab" containerName="mariadb-database-create" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.496550 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="feef59e0-044c-4c9b-b496-44c30b067129" containerName="mariadb-database-create" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.496574 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="6887cfd9-80d7-43bc-bcb6-01b264f72d5a" containerName="dnsmasq-dns" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.496586 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="534daef5-5335-4b53-ba13-980e53570cd7" containerName="mariadb-database-create" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.497299 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a9aa-account-create-zkllw" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.499566 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.508580 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a9aa-account-create-zkllw"] Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.534682 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.583044 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kbwz\" (UniqueName: \"kubernetes.io/projected/892baf7e-37ac-4364-9122-418e1997266d-kube-api-access-7kbwz\") pod \"keystone-a9aa-account-create-zkllw\" (UID: \"892baf7e-37ac-4364-9122-418e1997266d\") " pod="openstack/keystone-a9aa-account-create-zkllw" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.684412 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-combined-ca-bundle\") pod \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.686158 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-dispersionconf\") pod \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.686217 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6aff3b1c-df57-4faf-9c6b-1009d5090a13-etc-swift\") pod 
\"6aff3b1c-df57-4faf-9c6b-1009d5090a13\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.686291 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpd6m\" (UniqueName: \"kubernetes.io/projected/6aff3b1c-df57-4faf-9c6b-1009d5090a13-kube-api-access-wpd6m\") pod \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.686365 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-swiftconf\") pod \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.686397 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-ring-data-devices\") pod \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.686432 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-scripts\") pod \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\" (UID: \"6aff3b1c-df57-4faf-9c6b-1009d5090a13\") " Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.686875 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kbwz\" (UniqueName: \"kubernetes.io/projected/892baf7e-37ac-4364-9122-418e1997266d-kube-api-access-7kbwz\") pod \"keystone-a9aa-account-create-zkllw\" (UID: \"892baf7e-37ac-4364-9122-418e1997266d\") " pod="openstack/keystone-a9aa-account-create-zkllw" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.694639 4974 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6aff3b1c-df57-4faf-9c6b-1009d5090a13-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6aff3b1c-df57-4faf-9c6b-1009d5090a13" (UID: "6aff3b1c-df57-4faf-9c6b-1009d5090a13"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.695770 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6aff3b1c-df57-4faf-9c6b-1009d5090a13" (UID: "6aff3b1c-df57-4faf-9c6b-1009d5090a13"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.706853 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6aff3b1c-df57-4faf-9c6b-1009d5090a13" (UID: "6aff3b1c-df57-4faf-9c6b-1009d5090a13"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.707575 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.721114 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6aff3b1c-df57-4faf-9c6b-1009d5090a13-kube-api-access-wpd6m" (OuterVolumeSpecName: "kube-api-access-wpd6m") pod "6aff3b1c-df57-4faf-9c6b-1009d5090a13" (UID: "6aff3b1c-df57-4faf-9c6b-1009d5090a13"). InnerVolumeSpecName "kube-api-access-wpd6m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.744950 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kbwz\" (UniqueName: \"kubernetes.io/projected/892baf7e-37ac-4364-9122-418e1997266d-kube-api-access-7kbwz\") pod \"keystone-a9aa-account-create-zkllw\" (UID: \"892baf7e-37ac-4364-9122-418e1997266d\") " pod="openstack/keystone-a9aa-account-create-zkllw" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.790014 4974 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-dispersionconf\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.790048 4974 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6aff3b1c-df57-4faf-9c6b-1009d5090a13-etc-swift\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.790059 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpd6m\" (UniqueName: \"kubernetes.io/projected/6aff3b1c-df57-4faf-9c6b-1009d5090a13-kube-api-access-wpd6m\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.790069 4974 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-ring-data-devices\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.792027 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-scripts" (OuterVolumeSpecName: "scripts") pod "6aff3b1c-df57-4faf-9c6b-1009d5090a13" (UID: "6aff3b1c-df57-4faf-9c6b-1009d5090a13"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.804800 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6aff3b1c-df57-4faf-9c6b-1009d5090a13" (UID: "6aff3b1c-df57-4faf-9c6b-1009d5090a13"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.807527 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6aff3b1c-df57-4faf-9c6b-1009d5090a13" (UID: "6aff3b1c-df57-4faf-9c6b-1009d5090a13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.847507 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-c003-account-create-242fn"] Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.847801 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a9aa-account-create-zkllw" Oct 13 18:30:13 crc kubenswrapper[4974]: E1013 18:30:13.847911 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6aff3b1c-df57-4faf-9c6b-1009d5090a13" containerName="swift-ring-rebalance" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.847929 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6aff3b1c-df57-4faf-9c6b-1009d5090a13" containerName="swift-ring-rebalance" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.848084 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="6aff3b1c-df57-4faf-9c6b-1009d5090a13" containerName="swift-ring-rebalance" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.848663 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c003-account-create-242fn" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.852022 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.872626 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c003-account-create-242fn"] Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.896936 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6aff3b1c-df57-4faf-9c6b-1009d5090a13-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.896994 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.897007 4974 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6aff3b1c-df57-4faf-9c6b-1009d5090a13-swiftconf\") on node \"crc\" DevicePath \"\"" 
Oct 13 18:30:13 crc kubenswrapper[4974]: I1013 18:30:13.998206 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jnlc\" (UniqueName: \"kubernetes.io/projected/756682a7-9eac-443f-a6a0-2d979d1268d3-kube-api-access-5jnlc\") pod \"placement-c003-account-create-242fn\" (UID: \"756682a7-9eac-443f-a6a0-2d979d1268d3\") " pod="openstack/placement-c003-account-create-242fn" Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.100379 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jnlc\" (UniqueName: \"kubernetes.io/projected/756682a7-9eac-443f-a6a0-2d979d1268d3-kube-api-access-5jnlc\") pod \"placement-c003-account-create-242fn\" (UID: \"756682a7-9eac-443f-a6a0-2d979d1268d3\") " pod="openstack/placement-c003-account-create-242fn" Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.114967 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-7phmq" event={"ID":"6aff3b1c-df57-4faf-9c6b-1009d5090a13","Type":"ContainerDied","Data":"4aee309085826944e62400e6d521afbdc9102a6b0482704cb99c7d6546efd5eb"} Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.115004 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aee309085826944e62400e6d521afbdc9102a6b0482704cb99c7d6546efd5eb" Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.115053 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-7phmq" Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.116241 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jnlc\" (UniqueName: \"kubernetes.io/projected/756682a7-9eac-443f-a6a0-2d979d1268d3-kube-api-access-5jnlc\") pod \"placement-c003-account-create-242fn\" (UID: \"756682a7-9eac-443f-a6a0-2d979d1268d3\") " pod="openstack/placement-c003-account-create-242fn" Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.118170 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"04ffced4a86bb1b831420e001cf426d571978b64333315fb5248bd5d2490894f"} Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.194055 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c003-account-create-242fn" Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.240491 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.359758 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a9aa-account-create-zkllw"] Oct 13 18:30:14 crc kubenswrapper[4974]: W1013 18:30:14.363966 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod892baf7e_37ac_4364_9122_418e1997266d.slice/crio-0616640c9b4e21f94a2bfeb33adeed41e849b35d47385ca605175ee97b5e95a0 WatchSource:0}: Error finding container 0616640c9b4e21f94a2bfeb33adeed41e849b35d47385ca605175ee97b5e95a0: Status 404 returned error can't find the container with id 0616640c9b4e21f94a2bfeb33adeed41e849b35d47385ca605175ee97b5e95a0 Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.675160 4974 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/ovn-controller-r5pdv" podUID="c233290a-abb8-4429-8500-f4ec541ccc21" containerName="ovn-controller" probeResult="failure" output=< Oct 13 18:30:14 crc kubenswrapper[4974]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 13 18:30:14 crc kubenswrapper[4974]: > Oct 13 18:30:14 crc kubenswrapper[4974]: I1013 18:30:14.678631 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-c003-account-create-242fn"] Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.134361 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a9aa-account-create-zkllw" event={"ID":"892baf7e-37ac-4364-9122-418e1997266d","Type":"ContainerStarted","Data":"0616640c9b4e21f94a2bfeb33adeed41e849b35d47385ca605175ee97b5e95a0"} Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.138234 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c003-account-create-242fn" event={"ID":"756682a7-9eac-443f-a6a0-2d979d1268d3","Type":"ContainerStarted","Data":"3d2bdf0f869528667af8ee377afb22e77a96132e1787e037e1934c4274b8e973"} Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.138536 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="thanos-sidecar" containerID="cri-o://3648f94d39bea1fd74a5338d31e824f4a7c317426ef04dd94384c7aeb6fe4431" gracePeriod=600 Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.138569 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="config-reloader" containerID="cri-o://30e5efccd116a18e824bf0a06f3202193f9a04cc1e702e3c59c8e64391f9d065" gracePeriod=600 Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.139099 4974 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/prometheus-metric-storage-0" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="prometheus" containerID="cri-o://b0aacf843ad242659bec4fa2c523ef04dbdaa3aab4e7148b88e9cfff2b3dfd45" gracePeriod=600 Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.664312 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-510b-account-create-dtxwp"] Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.665817 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-510b-account-create-dtxwp" Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.667332 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.675470 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-510b-account-create-dtxwp"] Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.863508 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qtdd\" (UniqueName: \"kubernetes.io/projected/76e96c17-0dd3-4925-b4f5-cf2ee96b2868-kube-api-access-5qtdd\") pod \"watcher-510b-account-create-dtxwp\" (UID: \"76e96c17-0dd3-4925-b4f5-cf2ee96b2868\") " pod="openstack/watcher-510b-account-create-dtxwp" Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.965860 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qtdd\" (UniqueName: \"kubernetes.io/projected/76e96c17-0dd3-4925-b4f5-cf2ee96b2868-kube-api-access-5qtdd\") pod \"watcher-510b-account-create-dtxwp\" (UID: \"76e96c17-0dd3-4925-b4f5-cf2ee96b2868\") " pod="openstack/watcher-510b-account-create-dtxwp" Oct 13 18:30:15 crc kubenswrapper[4974]: I1013 18:30:15.986520 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qtdd\" (UniqueName: 
\"kubernetes.io/projected/76e96c17-0dd3-4925-b4f5-cf2ee96b2868-kube-api-access-5qtdd\") pod \"watcher-510b-account-create-dtxwp\" (UID: \"76e96c17-0dd3-4925-b4f5-cf2ee96b2868\") " pod="openstack/watcher-510b-account-create-dtxwp" Oct 13 18:30:16 crc kubenswrapper[4974]: I1013 18:30:16.002445 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-510b-account-create-dtxwp" Oct 13 18:30:16 crc kubenswrapper[4974]: I1013 18:30:16.501359 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-510b-account-create-dtxwp"] Oct 13 18:30:16 crc kubenswrapper[4974]: W1013 18:30:16.509465 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod76e96c17_0dd3_4925_b4f5_cf2ee96b2868.slice/crio-2f093f0665cbf1da0c8e9f44382474b466a857faf4d1880b410077180185ae9a WatchSource:0}: Error finding container 2f093f0665cbf1da0c8e9f44382474b466a857faf4d1880b410077180185ae9a: Status 404 returned error can't find the container with id 2f093f0665cbf1da0c8e9f44382474b466a857faf4d1880b410077180185ae9a Oct 13 18:30:16 crc kubenswrapper[4974]: I1013 18:30:16.869196 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="prometheus" probeResult="failure" output="HTTP probe failed with statuscode: 503" Oct 13 18:30:17 crc kubenswrapper[4974]: I1013 18:30:17.162360 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-510b-account-create-dtxwp" event={"ID":"76e96c17-0dd3-4925-b4f5-cf2ee96b2868","Type":"ContainerStarted","Data":"2f093f0665cbf1da0c8e9f44382474b466a857faf4d1880b410077180185ae9a"} Oct 13 18:30:17 crc kubenswrapper[4974]: I1013 18:30:17.165204 4974 generic.go:334] "Generic (PLEG): container finished" podID="a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" 
containerID="f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec" exitCode=0 Oct 13 18:30:17 crc kubenswrapper[4974]: I1013 18:30:17.165282 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f","Type":"ContainerDied","Data":"f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec"} Oct 13 18:30:17 crc kubenswrapper[4974]: I1013 18:30:17.167566 4974 generic.go:334] "Generic (PLEG): container finished" podID="a3edaa1a-d213-473f-963a-3bfea41226ec" containerID="0e3d86bf6a5876919060cdf9a8524e67172b2a586cadba6d8a0d3e5c0cd22c2a" exitCode=0 Oct 13 18:30:17 crc kubenswrapper[4974]: I1013 18:30:17.167634 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"a3edaa1a-d213-473f-963a-3bfea41226ec","Type":"ContainerDied","Data":"0e3d86bf6a5876919060cdf9a8524e67172b2a586cadba6d8a0d3e5c0cd22c2a"} Oct 13 18:30:17 crc kubenswrapper[4974]: I1013 18:30:17.171416 4974 generic.go:334] "Generic (PLEG): container finished" podID="97f813f5-f34c-4b82-b066-032f8b795049" containerID="9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2" exitCode=0 Oct 13 18:30:17 crc kubenswrapper[4974]: I1013 18:30:17.171477 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f813f5-f34c-4b82-b066-032f8b795049","Type":"ContainerDied","Data":"9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2"} Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.197961 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c003-account-create-242fn" event={"ID":"756682a7-9eac-443f-a6a0-2d979d1268d3","Type":"ContainerStarted","Data":"73f9fdfd8f691ddd6d02816c89d10cb653aa773d81715f9cf6e195f5a320f77f"} Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.200616 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/watcher-510b-account-create-dtxwp" event={"ID":"76e96c17-0dd3-4925-b4f5-cf2ee96b2868","Type":"ContainerStarted","Data":"b6a59ad96a06bff61b6c41d0f84c5b47a2ef3391d0dc087811c2809431789612"} Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.206522 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f","Type":"ContainerStarted","Data":"f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9"} Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.207386 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.209638 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"a3edaa1a-d213-473f-963a-3bfea41226ec","Type":"ContainerStarted","Data":"bc15262d160fc11c91eb456e6392d04538a1bb62de5bb96b98459855d67bf8df"} Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.210161 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.212594 4974 generic.go:334] "Generic (PLEG): container finished" podID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerID="3648f94d39bea1fd74a5338d31e824f4a7c317426ef04dd94384c7aeb6fe4431" exitCode=0 Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.212619 4974 generic.go:334] "Generic (PLEG): container finished" podID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerID="30e5efccd116a18e824bf0a06f3202193f9a04cc1e702e3c59c8e64391f9d065" exitCode=0 Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.212632 4974 generic.go:334] "Generic (PLEG): container finished" podID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerID="b0aacf843ad242659bec4fa2c523ef04dbdaa3aab4e7148b88e9cfff2b3dfd45" exitCode=0 Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 
18:30:18.212687 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9981a58f-bf58-44c5-9da9-fbe0ef9005a9","Type":"ContainerDied","Data":"3648f94d39bea1fd74a5338d31e824f4a7c317426ef04dd94384c7aeb6fe4431"} Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.212708 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9981a58f-bf58-44c5-9da9-fbe0ef9005a9","Type":"ContainerDied","Data":"30e5efccd116a18e824bf0a06f3202193f9a04cc1e702e3c59c8e64391f9d065"} Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.212720 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9981a58f-bf58-44c5-9da9-fbe0ef9005a9","Type":"ContainerDied","Data":"b0aacf843ad242659bec4fa2c523ef04dbdaa3aab4e7148b88e9cfff2b3dfd45"} Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.217988 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a9aa-account-create-zkllw" event={"ID":"892baf7e-37ac-4364-9122-418e1997266d","Type":"ContainerStarted","Data":"016d41110f2a4b9c980c6704250037c47564e05b346e841c0eb611c384e8cac5"} Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.220052 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-c003-account-create-242fn" podStartSLOduration=5.220035355 podStartE2EDuration="5.220035355s" podCreationTimestamp="2025-10-13 18:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:30:18.218104 +0000 UTC m=+953.122470090" watchObservedRunningTime="2025-10-13 18:30:18.220035355 +0000 UTC m=+953.124401445" Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.230304 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"97f813f5-f34c-4b82-b066-032f8b795049","Type":"ContainerStarted","Data":"08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241"} Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.230544 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.249782 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=50.319605037 podStartE2EDuration="1m0.249767773s" podCreationTimestamp="2025-10-13 18:29:18 +0000 UTC" firstStartedPulling="2025-10-13 18:29:32.083070622 +0000 UTC m=+906.987436702" lastFinishedPulling="2025-10-13 18:29:42.013233348 +0000 UTC m=+916.917599438" observedRunningTime="2025-10-13 18:30:18.243890117 +0000 UTC m=+953.148256197" watchObservedRunningTime="2025-10-13 18:30:18.249767773 +0000 UTC m=+953.154133853" Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.290287 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-a9aa-account-create-zkllw" podStartSLOduration=5.290273535 podStartE2EDuration="5.290273535s" podCreationTimestamp="2025-10-13 18:30:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:30:18.290260245 +0000 UTC m=+953.194626325" watchObservedRunningTime="2025-10-13 18:30:18.290273535 +0000 UTC m=+953.194639615" Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.295254 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=50.197785592 podStartE2EDuration="1m0.295245355s" podCreationTimestamp="2025-10-13 18:29:18 +0000 UTC" firstStartedPulling="2025-10-13 18:29:31.424525433 +0000 UTC m=+906.328891533" lastFinishedPulling="2025-10-13 18:29:41.521985216 +0000 UTC m=+916.426351296" 
observedRunningTime="2025-10-13 18:30:18.273967405 +0000 UTC m=+953.178333485" watchObservedRunningTime="2025-10-13 18:30:18.295245355 +0000 UTC m=+953.199611435" Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.318031 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=49.508118016 podStartE2EDuration="59.318009707s" podCreationTimestamp="2025-10-13 18:29:19 +0000 UTC" firstStartedPulling="2025-10-13 18:29:32.203462 +0000 UTC m=+907.107828080" lastFinishedPulling="2025-10-13 18:29:42.013353691 +0000 UTC m=+916.917719771" observedRunningTime="2025-10-13 18:30:18.312112801 +0000 UTC m=+953.216478881" watchObservedRunningTime="2025-10-13 18:30:18.318009707 +0000 UTC m=+953.222375807" Oct 13 18:30:18 crc kubenswrapper[4974]: I1013 18:30:18.916195 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.021202 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-tls-assets\") pod \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.021248 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-thanos-prometheus-http-client-file\") pod \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.021310 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config\") pod \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\" (UID: 
\"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.021488 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.021574 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config-out\") pod \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.021683 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gmkn6\" (UniqueName: \"kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-kube-api-access-gmkn6\") pod \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.021739 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-prometheus-metric-storage-rulefiles-0\") pod \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.021780 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-web-config\") pod \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\" (UID: \"9981a58f-bf58-44c5-9da9-fbe0ef9005a9\") " Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.027331 4974 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "9981a58f-bf58-44c5-9da9-fbe0ef9005a9" (UID: "9981a58f-bf58-44c5-9da9-fbe0ef9005a9"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.035793 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-kube-api-access-gmkn6" (OuterVolumeSpecName: "kube-api-access-gmkn6") pod "9981a58f-bf58-44c5-9da9-fbe0ef9005a9" (UID: "9981a58f-bf58-44c5-9da9-fbe0ef9005a9"). InnerVolumeSpecName "kube-api-access-gmkn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.043832 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config" (OuterVolumeSpecName: "config") pod "9981a58f-bf58-44c5-9da9-fbe0ef9005a9" (UID: "9981a58f-bf58-44c5-9da9-fbe0ef9005a9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.043878 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "9981a58f-bf58-44c5-9da9-fbe0ef9005a9" (UID: "9981a58f-bf58-44c5-9da9-fbe0ef9005a9"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.043928 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "9981a58f-bf58-44c5-9da9-fbe0ef9005a9" (UID: "9981a58f-bf58-44c5-9da9-fbe0ef9005a9"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.047022 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config-out" (OuterVolumeSpecName: "config-out") pod "9981a58f-bf58-44c5-9da9-fbe0ef9005a9" (UID: "9981a58f-bf58-44c5-9da9-fbe0ef9005a9"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.058429 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-web-config" (OuterVolumeSpecName: "web-config") pod "9981a58f-bf58-44c5-9da9-fbe0ef9005a9" (UID: "9981a58f-bf58-44c5-9da9-fbe0ef9005a9"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.079710 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "9981a58f-bf58-44c5-9da9-fbe0ef9005a9" (UID: "9981a58f-bf58-44c5-9da9-fbe0ef9005a9"). InnerVolumeSpecName "pvc-1fd59f2a-5ebb-4374-98ce-075e681db308". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.129017 4974 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.129052 4974 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-web-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.129065 4974 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.129075 4974 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.129085 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.129112 4974 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") on node \"crc\" " Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.129122 4974 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-config-out\") on node \"crc\" 
DevicePath \"\"" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.129132 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gmkn6\" (UniqueName: \"kubernetes.io/projected/9981a58f-bf58-44c5-9da9-fbe0ef9005a9-kube-api-access-gmkn6\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.160558 4974 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.160924 4974 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1fd59f2a-5ebb-4374-98ce-075e681db308" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308") on node "crc" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.230164 4974 reconciler_common.go:293] "Volume detached for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.238672 4974 generic.go:334] "Generic (PLEG): container finished" podID="892baf7e-37ac-4364-9122-418e1997266d" containerID="016d41110f2a4b9c980c6704250037c47564e05b346e841c0eb611c384e8cac5" exitCode=0 Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.238737 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a9aa-account-create-zkllw" event={"ID":"892baf7e-37ac-4364-9122-418e1997266d","Type":"ContainerDied","Data":"016d41110f2a4b9c980c6704250037c47564e05b346e841c0eb611c384e8cac5"} Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.240464 4974 generic.go:334] "Generic (PLEG): container finished" podID="756682a7-9eac-443f-a6a0-2d979d1268d3" containerID="73f9fdfd8f691ddd6d02816c89d10cb653aa773d81715f9cf6e195f5a320f77f" exitCode=0 Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.240523 
4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c003-account-create-242fn" event={"ID":"756682a7-9eac-443f-a6a0-2d979d1268d3","Type":"ContainerDied","Data":"73f9fdfd8f691ddd6d02816c89d10cb653aa773d81715f9cf6e195f5a320f77f"} Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.242487 4974 generic.go:334] "Generic (PLEG): container finished" podID="76e96c17-0dd3-4925-b4f5-cf2ee96b2868" containerID="b6a59ad96a06bff61b6c41d0f84c5b47a2ef3391d0dc087811c2809431789612" exitCode=0 Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.242532 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-510b-account-create-dtxwp" event={"ID":"76e96c17-0dd3-4925-b4f5-cf2ee96b2868","Type":"ContainerDied","Data":"b6a59ad96a06bff61b6c41d0f84c5b47a2ef3391d0dc087811c2809431789612"} Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.243998 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"fc5dabe48d40831d69264290ab1faf820ddaf5f6cd77c3dfb278a0aa803b3492"} Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.246183 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"9981a58f-bf58-44c5-9da9-fbe0ef9005a9","Type":"ContainerDied","Data":"31297244d1308c0d63f1a2d04fc79b889566cc41398ad62d52beb138177ea30a"} Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.246249 4974 scope.go:117] "RemoveContainer" containerID="3648f94d39bea1fd74a5338d31e824f4a7c317426ef04dd94384c7aeb6fe4431" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.246378 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.261775 4974 scope.go:117] "RemoveContainer" containerID="30e5efccd116a18e824bf0a06f3202193f9a04cc1e702e3c59c8e64391f9d065" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.280953 4974 scope.go:117] "RemoveContainer" containerID="b0aacf843ad242659bec4fa2c523ef04dbdaa3aab4e7148b88e9cfff2b3dfd45" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.302675 4974 scope.go:117] "RemoveContainer" containerID="fef82051b786278429ce9f99d7d6776b6f30b36730555d1be52d8d010a0455d8" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.306741 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.318750 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.347143 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 18:30:19 crc kubenswrapper[4974]: E1013 18:30:19.347457 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="prometheus" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.347476 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="prometheus" Oct 13 18:30:19 crc kubenswrapper[4974]: E1013 18:30:19.347490 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="init-config-reloader" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.347496 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="init-config-reloader" Oct 13 18:30:19 crc kubenswrapper[4974]: E1013 18:30:19.347516 4974 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="config-reloader" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.347522 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="config-reloader" Oct 13 18:30:19 crc kubenswrapper[4974]: E1013 18:30:19.347541 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="thanos-sidecar" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.347547 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="thanos-sidecar" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.347708 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="thanos-sidecar" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.347728 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="prometheus" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.347744 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" containerName="config-reloader" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.354568 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.358533 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.358621 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-97gxg" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.358754 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.358791 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.359981 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.360114 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.362257 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.366996 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.433551 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 
18:30:19.433628 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-config\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.433683 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.433713 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwc77\" (UniqueName: \"kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-kube-api-access-pwc77\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.433753 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.433791 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7f3cf41f-a929-4f3c-9063-682924915904-prometheus-metric-storage-rulefiles-0\") pod 
\"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.433836 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.433859 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.433880 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.433939 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7f3cf41f-a929-4f3c-9063-682924915904-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.433998 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.535179 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.535249 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.535292 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-config\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.535322 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc 
kubenswrapper[4974]: I1013 18:30:19.535392 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwc77\" (UniqueName: \"kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-kube-api-access-pwc77\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.535605 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.535683 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7f3cf41f-a929-4f3c-9063-682924915904-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.536094 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.536138 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod 
\"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.536158 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.536181 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7f3cf41f-a929-4f3c-9063-682924915904-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.536405 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7f3cf41f-a929-4f3c-9063-682924915904-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.540026 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.540319 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/7f3cf41f-a929-4f3c-9063-682924915904-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.540785 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.542129 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.542257 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-config\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.545496 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.546582 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-thanos-prometheus-http-client-file\") pod 
\"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.546600 4974 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.546550 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.546691 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e453070dec09825da50fcd48128605195703a3e04c8868309f22a520ea4896c6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.558041 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwc77\" (UniqueName: \"kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-kube-api-access-pwc77\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.584964 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.657053 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-r5pdv" podUID="c233290a-abb8-4429-8500-f4ec541ccc21" containerName="ovn-controller" probeResult="failure" output=< Oct 13 18:30:19 crc kubenswrapper[4974]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Oct 13 18:30:19 crc kubenswrapper[4974]: > Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.676443 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.681133 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.695515 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cxs58" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.871088 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9981a58f-bf58-44c5-9da9-fbe0ef9005a9" path="/var/lib/kubelet/pods/9981a58f-bf58-44c5-9da9-fbe0ef9005a9/volumes" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.907487 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r5pdv-config-5dmh7"] Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.908617 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.911837 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.918344 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv-config-5dmh7"] Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.942980 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-log-ovn\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.943018 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-scripts\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.943073 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.943089 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run-ovn\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") 
" pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.943128 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4jcn\" (UniqueName: \"kubernetes.io/projected/1329160a-6a42-45dd-a36b-1b35e28fdc4c-kube-api-access-f4jcn\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:19 crc kubenswrapper[4974]: I1013 18:30:19.943186 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-additional-scripts\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.044568 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.044610 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run-ovn\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.044667 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4jcn\" (UniqueName: \"kubernetes.io/projected/1329160a-6a42-45dd-a36b-1b35e28fdc4c-kube-api-access-f4jcn\") pod 
\"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.044728 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-additional-scripts\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.044757 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-log-ovn\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.044780 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-scripts\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.044987 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.045358 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run-ovn\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: 
\"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.045789 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-log-ovn\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.045986 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-additional-scripts\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.046861 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-scripts\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.069280 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4jcn\" (UniqueName: \"kubernetes.io/projected/1329160a-6a42-45dd-a36b-1b35e28fdc4c-kube-api-access-f4jcn\") pod \"ovn-controller-r5pdv-config-5dmh7\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.234263 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.243039 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.272844 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"e8e728501f5a98f9bb0284fc9fa697d5988a8ba623135587f255b357c2b5fd28"} Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.272905 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"2d819b0998db675697c209120a82cb8a124eb12fa0c29d258d9edcd1096ba54c"} Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.272916 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"c304dd0bca9b9c9807c9cab42b788866e2f8455e9782b269e2d19ce258129a23"} Oct 13 18:30:20 crc kubenswrapper[4974]: W1013 18:30:20.740577 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f3cf41f_a929_4f3c_9063_682924915904.slice/crio-ea9183d20ca0204634cade5fcfc457aaa5f1a6e3b5a910e030e4ececcc8003c6 WatchSource:0}: Error finding container ea9183d20ca0204634cade5fcfc457aaa5f1a6e3b5a910e030e4ececcc8003c6: Status 404 returned error can't find the container with id ea9183d20ca0204634cade5fcfc457aaa5f1a6e3b5a910e030e4ececcc8003c6 Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.871028 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-510b-account-create-dtxwp" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.884739 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-c003-account-create-242fn" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.927969 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a9aa-account-create-zkllw" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.971036 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qtdd\" (UniqueName: \"kubernetes.io/projected/76e96c17-0dd3-4925-b4f5-cf2ee96b2868-kube-api-access-5qtdd\") pod \"76e96c17-0dd3-4925-b4f5-cf2ee96b2868\" (UID: \"76e96c17-0dd3-4925-b4f5-cf2ee96b2868\") " Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.971128 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kbwz\" (UniqueName: \"kubernetes.io/projected/892baf7e-37ac-4364-9122-418e1997266d-kube-api-access-7kbwz\") pod \"892baf7e-37ac-4364-9122-418e1997266d\" (UID: \"892baf7e-37ac-4364-9122-418e1997266d\") " Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.971230 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jnlc\" (UniqueName: \"kubernetes.io/projected/756682a7-9eac-443f-a6a0-2d979d1268d3-kube-api-access-5jnlc\") pod \"756682a7-9eac-443f-a6a0-2d979d1268d3\" (UID: \"756682a7-9eac-443f-a6a0-2d979d1268d3\") " Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.976754 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/756682a7-9eac-443f-a6a0-2d979d1268d3-kube-api-access-5jnlc" (OuterVolumeSpecName: "kube-api-access-5jnlc") pod "756682a7-9eac-443f-a6a0-2d979d1268d3" (UID: "756682a7-9eac-443f-a6a0-2d979d1268d3"). InnerVolumeSpecName "kube-api-access-5jnlc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.977239 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892baf7e-37ac-4364-9122-418e1997266d-kube-api-access-7kbwz" (OuterVolumeSpecName: "kube-api-access-7kbwz") pod "892baf7e-37ac-4364-9122-418e1997266d" (UID: "892baf7e-37ac-4364-9122-418e1997266d"). InnerVolumeSpecName "kube-api-access-7kbwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:20 crc kubenswrapper[4974]: I1013 18:30:20.977321 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76e96c17-0dd3-4925-b4f5-cf2ee96b2868-kube-api-access-5qtdd" (OuterVolumeSpecName: "kube-api-access-5qtdd") pod "76e96c17-0dd3-4925-b4f5-cf2ee96b2868" (UID: "76e96c17-0dd3-4925-b4f5-cf2ee96b2868"). InnerVolumeSpecName "kube-api-access-5qtdd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.073350 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qtdd\" (UniqueName: \"kubernetes.io/projected/76e96c17-0dd3-4925-b4f5-cf2ee96b2868-kube-api-access-5qtdd\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.073606 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kbwz\" (UniqueName: \"kubernetes.io/projected/892baf7e-37ac-4364-9122-418e1997266d-kube-api-access-7kbwz\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.073615 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jnlc\" (UniqueName: \"kubernetes.io/projected/756682a7-9eac-443f-a6a0-2d979d1268d3-kube-api-access-5jnlc\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.295371 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a9aa-account-create-zkllw" Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.295362 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a9aa-account-create-zkllw" event={"ID":"892baf7e-37ac-4364-9122-418e1997266d","Type":"ContainerDied","Data":"0616640c9b4e21f94a2bfeb33adeed41e849b35d47385ca605175ee97b5e95a0"} Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.296295 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0616640c9b4e21f94a2bfeb33adeed41e849b35d47385ca605175ee97b5e95a0" Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.297617 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv-config-5dmh7"] Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.302763 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7f3cf41f-a929-4f3c-9063-682924915904","Type":"ContainerStarted","Data":"ea9183d20ca0204634cade5fcfc457aaa5f1a6e3b5a910e030e4ececcc8003c6"} Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.303903 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-c003-account-create-242fn" event={"ID":"756682a7-9eac-443f-a6a0-2d979d1268d3","Type":"ContainerDied","Data":"3d2bdf0f869528667af8ee377afb22e77a96132e1787e037e1934c4274b8e973"} Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.303925 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-c003-account-create-242fn" Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.303936 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d2bdf0f869528667af8ee377afb22e77a96132e1787e037e1934c4274b8e973" Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.312663 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-510b-account-create-dtxwp" Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.312669 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-510b-account-create-dtxwp" event={"ID":"76e96c17-0dd3-4925-b4f5-cf2ee96b2868","Type":"ContainerDied","Data":"2f093f0665cbf1da0c8e9f44382474b466a857faf4d1880b410077180185ae9a"} Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.312832 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f093f0665cbf1da0c8e9f44382474b466a857faf4d1880b410077180185ae9a" Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.318149 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"7599675fa818007a2d91aa06596d10d381f0f8e0afd5c647d832d6f7279de592"} Oct 13 18:30:21 crc kubenswrapper[4974]: I1013 18:30:21.318178 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"77b0b0d3a4de89d76f72a19aac489a3075526eb6e2e91a301b4ac45f45732ad7"} Oct 13 18:30:22 crc kubenswrapper[4974]: I1013 18:30:22.331727 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"49f9a18cf0f506c780b71472a3b7bf36c5647d5fa71c2983e623ca59a950aab5"} Oct 13 18:30:22 crc kubenswrapper[4974]: I1013 18:30:22.332270 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"e438476c35e60282bb299355fd8edfd6a2a9716df90321a93bbd8594abb446f6"} Oct 13 18:30:22 crc kubenswrapper[4974]: I1013 18:30:22.338190 4974 generic.go:334] "Generic (PLEG): container finished" podID="1329160a-6a42-45dd-a36b-1b35e28fdc4c" 
containerID="b2b9cfeee6089bf9ad2b4ad83cff4fd85575fca3cbee58b630a2b257678224ed" exitCode=0 Oct 13 18:30:22 crc kubenswrapper[4974]: I1013 18:30:22.338233 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-5dmh7" event={"ID":"1329160a-6a42-45dd-a36b-1b35e28fdc4c","Type":"ContainerDied","Data":"b2b9cfeee6089bf9ad2b4ad83cff4fd85575fca3cbee58b630a2b257678224ed"} Oct 13 18:30:22 crc kubenswrapper[4974]: I1013 18:30:22.338259 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-5dmh7" event={"ID":"1329160a-6a42-45dd-a36b-1b35e28fdc4c","Type":"ContainerStarted","Data":"4aa83281bfc9d279904245a4b42f2a0ac171264c5476c1e2d1bdf6dedddd156d"} Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.350120 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"141dda9de9a3f08fc90190716f54ae49c42655b4fbcdff5306d8229eba1ab44f"} Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.350458 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"a19492a2f2a136ad7001036340cfa590743017582c0fbac40fb9e2a6014b97df"} Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.843006 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.917515 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-additional-scripts\") pod \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.917637 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-scripts\") pod \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.918400 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "1329160a-6a42-45dd-a36b-1b35e28fdc4c" (UID: "1329160a-6a42-45dd-a36b-1b35e28fdc4c"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.918934 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-scripts" (OuterVolumeSpecName: "scripts") pod "1329160a-6a42-45dd-a36b-1b35e28fdc4c" (UID: "1329160a-6a42-45dd-a36b-1b35e28fdc4c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.919032 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run\") pod \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.919101 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run" (OuterVolumeSpecName: "var-run") pod "1329160a-6a42-45dd-a36b-1b35e28fdc4c" (UID: "1329160a-6a42-45dd-a36b-1b35e28fdc4c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.919196 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run-ovn\") pod \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.919263 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "1329160a-6a42-45dd-a36b-1b35e28fdc4c" (UID: "1329160a-6a42-45dd-a36b-1b35e28fdc4c"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.919360 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-log-ovn\") pod \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.919427 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "1329160a-6a42-45dd-a36b-1b35e28fdc4c" (UID: "1329160a-6a42-45dd-a36b-1b35e28fdc4c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.919477 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4jcn\" (UniqueName: \"kubernetes.io/projected/1329160a-6a42-45dd-a36b-1b35e28fdc4c-kube-api-access-f4jcn\") pod \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\" (UID: \"1329160a-6a42-45dd-a36b-1b35e28fdc4c\") " Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.925568 4974 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.925605 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1329160a-6a42-45dd-a36b-1b35e28fdc4c-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.925624 4974 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run\") on node \"crc\" DevicePath \"\"" Oct 13 
18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.925637 4974 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.925646 4974 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/1329160a-6a42-45dd-a36b-1b35e28fdc4c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:23 crc kubenswrapper[4974]: I1013 18:30:23.928721 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1329160a-6a42-45dd-a36b-1b35e28fdc4c-kube-api-access-f4jcn" (OuterVolumeSpecName: "kube-api-access-f4jcn") pod "1329160a-6a42-45dd-a36b-1b35e28fdc4c" (UID: "1329160a-6a42-45dd-a36b-1b35e28fdc4c"). InnerVolumeSpecName "kube-api-access-f4jcn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.027620 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4jcn\" (UniqueName: \"kubernetes.io/projected/1329160a-6a42-45dd-a36b-1b35e28fdc4c-kube-api-access-f4jcn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.362815 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"b511af103bc8fbd7d8da27fa42d80a594165c631478db2ede8cb97bd29c96045"} Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.362859 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"1685c91740feda34b083ae3f6909f2c337d761e11f193edcb427aa71323cf4b3"} Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.362871 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"1f69d3b64d0500eaec95e763ccc6a347987512149380baae01b0bb04d00d55bf"} Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.362879 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"17b55355bc0c68db86879a7093a30e6023a92cfecbd8fe1fb91a4a4762356f9f"} Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.362887 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"d7e32e29-6d51-4230-b7d5-911b0787a900","Type":"ContainerStarted","Data":"a8e2c3da5ef6bc350c34253940b23bbe6bf32bc84c43fe619f0f8269a90dc2df"} Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.365680 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-5dmh7" event={"ID":"1329160a-6a42-45dd-a36b-1b35e28fdc4c","Type":"ContainerDied","Data":"4aa83281bfc9d279904245a4b42f2a0ac171264c5476c1e2d1bdf6dedddd156d"} Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.365712 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aa83281bfc9d279904245a4b42f2a0ac171264c5476c1e2d1bdf6dedddd156d" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.365759 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-5dmh7" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.376228 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7f3cf41f-a929-4f3c-9063-682924915904","Type":"ContainerStarted","Data":"6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd"} Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.413556 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.795503637 podStartE2EDuration="29.413538988s" podCreationTimestamp="2025-10-13 18:29:55 +0000 UTC" firstStartedPulling="2025-10-13 18:30:13.720929649 +0000 UTC m=+948.625295729" lastFinishedPulling="2025-10-13 18:30:22.338965 +0000 UTC m=+957.243331080" observedRunningTime="2025-10-13 18:30:24.40722907 +0000 UTC m=+959.311595150" watchObservedRunningTime="2025-10-13 18:30:24.413538988 +0000 UTC m=+959.317905068" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.656589 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-r5pdv" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.737692 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f67486d89-6nn4q"] Oct 13 18:30:24 crc kubenswrapper[4974]: E1013 18:30:24.738089 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1329160a-6a42-45dd-a36b-1b35e28fdc4c" containerName="ovn-config" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.738113 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1329160a-6a42-45dd-a36b-1b35e28fdc4c" containerName="ovn-config" Oct 13 18:30:24 crc kubenswrapper[4974]: E1013 18:30:24.738155 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="892baf7e-37ac-4364-9122-418e1997266d" containerName="mariadb-account-create" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.738164 4974 
state_mem.go:107] "Deleted CPUSet assignment" podUID="892baf7e-37ac-4364-9122-418e1997266d" containerName="mariadb-account-create" Oct 13 18:30:24 crc kubenswrapper[4974]: E1013 18:30:24.738423 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="756682a7-9eac-443f-a6a0-2d979d1268d3" containerName="mariadb-account-create" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.738434 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="756682a7-9eac-443f-a6a0-2d979d1268d3" containerName="mariadb-account-create" Oct 13 18:30:24 crc kubenswrapper[4974]: E1013 18:30:24.738450 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76e96c17-0dd3-4925-b4f5-cf2ee96b2868" containerName="mariadb-account-create" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.738458 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="76e96c17-0dd3-4925-b4f5-cf2ee96b2868" containerName="mariadb-account-create" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.738645 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="76e96c17-0dd3-4925-b4f5-cf2ee96b2868" containerName="mariadb-account-create" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.738691 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="1329160a-6a42-45dd-a36b-1b35e28fdc4c" containerName="ovn-config" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.738713 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="756682a7-9eac-443f-a6a0-2d979d1268d3" containerName="mariadb-account-create" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.738722 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="892baf7e-37ac-4364-9122-418e1997266d" containerName="mariadb-account-create" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.741414 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.743036 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.748614 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f67486d89-6nn4q"] Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.842097 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-nb\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.842138 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-config\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.842294 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-swift-storage-0\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.842341 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-svc\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " 
pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.842363 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhts5\" (UniqueName: \"kubernetes.io/projected/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-kube-api-access-dhts5\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.842393 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-sb\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.938888 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r5pdv-config-5dmh7"] Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.943565 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-swift-storage-0\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.943631 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-svc\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.943670 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhts5\" 
(UniqueName: \"kubernetes.io/projected/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-kube-api-access-dhts5\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.943695 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-sb\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.943743 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-nb\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.943762 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-config\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.944891 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-config\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.944927 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-svc\") pod 
\"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.945072 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-sb\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.945300 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-nb\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.945437 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-swift-storage-0\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.946845 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r5pdv-config-5dmh7"] Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.961417 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhts5\" (UniqueName: \"kubernetes.io/projected/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-kube-api-access-dhts5\") pod \"dnsmasq-dns-6f67486d89-6nn4q\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.981190 4974 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-controller-r5pdv-config-sc2tr"] Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.982453 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.984467 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 13 18:30:24 crc kubenswrapper[4974]: I1013 18:30:24.998963 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv-config-sc2tr"] Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.069845 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.146828 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.147161 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-log-ovn\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.147194 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-additional-scripts\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " 
pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.147343 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run-ovn\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.147409 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9ll5\" (UniqueName: \"kubernetes.io/projected/246898f0-6dfa-4868-876f-99eccaae27a4-kube-api-access-g9ll5\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.147662 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-scripts\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.251379 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-scripts\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.251430 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: 
\"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.251465 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-log-ovn\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.251491 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-additional-scripts\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.251546 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run-ovn\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.251578 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9ll5\" (UniqueName: \"kubernetes.io/projected/246898f0-6dfa-4868-876f-99eccaae27a4-kube-api-access-g9ll5\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.251870 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: 
\"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.251870 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run-ovn\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.251951 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-log-ovn\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.253172 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-additional-scripts\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.253975 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-scripts\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.287059 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9ll5\" (UniqueName: \"kubernetes.io/projected/246898f0-6dfa-4868-876f-99eccaae27a4-kube-api-access-g9ll5\") pod \"ovn-controller-r5pdv-config-sc2tr\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " 
pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.332152 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.565235 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f67486d89-6nn4q"] Oct 13 18:30:25 crc kubenswrapper[4974]: W1013 18:30:25.802744 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod246898f0_6dfa_4868_876f_99eccaae27a4.slice/crio-530c9f6b46dc0252633b959aab87569c143c97fa0f49d486949e861e487b2bf4 WatchSource:0}: Error finding container 530c9f6b46dc0252633b959aab87569c143c97fa0f49d486949e861e487b2bf4: Status 404 returned error can't find the container with id 530c9f6b46dc0252633b959aab87569c143c97fa0f49d486949e861e487b2bf4 Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.809202 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv-config-sc2tr"] Oct 13 18:30:25 crc kubenswrapper[4974]: I1013 18:30:25.828729 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1329160a-6a42-45dd-a36b-1b35e28fdc4c" path="/var/lib/kubelet/pods/1329160a-6a42-45dd-a36b-1b35e28fdc4c/volumes" Oct 13 18:30:26 crc kubenswrapper[4974]: I1013 18:30:26.392332 4974 generic.go:334] "Generic (PLEG): container finished" podID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerID="94333711a6bedf7157f1d6d764a66a8412281b8ae658b77f6493ba3ff78f8636" exitCode=0 Oct 13 18:30:26 crc kubenswrapper[4974]: I1013 18:30:26.392486 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" event={"ID":"7a76ef05-7c82-4ef0-81c8-83fbef1d3496","Type":"ContainerDied","Data":"94333711a6bedf7157f1d6d764a66a8412281b8ae658b77f6493ba3ff78f8636"} Oct 13 18:30:26 crc kubenswrapper[4974]: I1013 18:30:26.392645 4974 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" event={"ID":"7a76ef05-7c82-4ef0-81c8-83fbef1d3496","Type":"ContainerStarted","Data":"58b0ee964d0f1531c693f956febc25435a97ed38703ed4c73938b00805af201c"} Oct 13 18:30:26 crc kubenswrapper[4974]: I1013 18:30:26.397129 4974 generic.go:334] "Generic (PLEG): container finished" podID="246898f0-6dfa-4868-876f-99eccaae27a4" containerID="7ed3a4b63164c74e011c06c8a38b884cbac99cd9f55e21363135f25bcacd086a" exitCode=0 Oct 13 18:30:26 crc kubenswrapper[4974]: I1013 18:30:26.397160 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-sc2tr" event={"ID":"246898f0-6dfa-4868-876f-99eccaae27a4","Type":"ContainerDied","Data":"7ed3a4b63164c74e011c06c8a38b884cbac99cd9f55e21363135f25bcacd086a"} Oct 13 18:30:26 crc kubenswrapper[4974]: I1013 18:30:26.397201 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-sc2tr" event={"ID":"246898f0-6dfa-4868-876f-99eccaae27a4","Type":"ContainerStarted","Data":"530c9f6b46dc0252633b959aab87569c143c97fa0f49d486949e861e487b2bf4"} Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.411937 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" event={"ID":"7a76ef05-7c82-4ef0-81c8-83fbef1d3496","Type":"ContainerStarted","Data":"c07a1d964e61f6c66cf31897362b4e3ede991f8a0095ba18954f3162cd3a5e9b"} Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.412025 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.450763 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" podStartSLOduration=3.450738551 podStartE2EDuration="3.450738551s" podCreationTimestamp="2025-10-13 18:30:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:30:27.43760512 +0000 UTC m=+962.341971210" watchObservedRunningTime="2025-10-13 18:30:27.450738551 +0000 UTC m=+962.355104651" Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.794756 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.904186 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-additional-scripts\") pod \"246898f0-6dfa-4868-876f-99eccaae27a4\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.904234 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-scripts\") pod \"246898f0-6dfa-4868-876f-99eccaae27a4\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.904264 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run\") pod \"246898f0-6dfa-4868-876f-99eccaae27a4\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.904303 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9ll5\" (UniqueName: \"kubernetes.io/projected/246898f0-6dfa-4868-876f-99eccaae27a4-kube-api-access-g9ll5\") pod \"246898f0-6dfa-4868-876f-99eccaae27a4\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.904394 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run-ovn\") pod \"246898f0-6dfa-4868-876f-99eccaae27a4\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.904411 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-log-ovn\") pod \"246898f0-6dfa-4868-876f-99eccaae27a4\" (UID: \"246898f0-6dfa-4868-876f-99eccaae27a4\") " Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.904806 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "246898f0-6dfa-4868-876f-99eccaae27a4" (UID: "246898f0-6dfa-4868-876f-99eccaae27a4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.904943 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run" (OuterVolumeSpecName: "var-run") pod "246898f0-6dfa-4868-876f-99eccaae27a4" (UID: "246898f0-6dfa-4868-876f-99eccaae27a4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.905078 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "246898f0-6dfa-4868-876f-99eccaae27a4" (UID: "246898f0-6dfa-4868-876f-99eccaae27a4"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.905400 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "246898f0-6dfa-4868-876f-99eccaae27a4" (UID: "246898f0-6dfa-4868-876f-99eccaae27a4"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.906378 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-scripts" (OuterVolumeSpecName: "scripts") pod "246898f0-6dfa-4868-876f-99eccaae27a4" (UID: "246898f0-6dfa-4868-876f-99eccaae27a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:27 crc kubenswrapper[4974]: I1013 18:30:27.909223 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/246898f0-6dfa-4868-876f-99eccaae27a4-kube-api-access-g9ll5" (OuterVolumeSpecName: "kube-api-access-g9ll5") pod "246898f0-6dfa-4868-876f-99eccaae27a4" (UID: "246898f0-6dfa-4868-876f-99eccaae27a4"). InnerVolumeSpecName "kube-api-access-g9ll5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.007508 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.008212 4974 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.008232 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9ll5\" (UniqueName: \"kubernetes.io/projected/246898f0-6dfa-4868-876f-99eccaae27a4-kube-api-access-g9ll5\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.008268 4974 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.008286 4974 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/246898f0-6dfa-4868-876f-99eccaae27a4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.008298 4974 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/246898f0-6dfa-4868-876f-99eccaae27a4-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.424016 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-sc2tr" event={"ID":"246898f0-6dfa-4868-876f-99eccaae27a4","Type":"ContainerDied","Data":"530c9f6b46dc0252633b959aab87569c143c97fa0f49d486949e861e487b2bf4"} Oct 13 18:30:28 crc 
kubenswrapper[4974]: I1013 18:30:28.424369 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="530c9f6b46dc0252633b959aab87569c143c97fa0f49d486949e861e487b2bf4" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.424035 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-sc2tr" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.893355 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r5pdv-config-sc2tr"] Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.903011 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r5pdv-config-sc2tr"] Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.930960 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r5pdv-config-zchl6"] Oct 13 18:30:28 crc kubenswrapper[4974]: E1013 18:30:28.931482 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="246898f0-6dfa-4868-876f-99eccaae27a4" containerName="ovn-config" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.931509 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="246898f0-6dfa-4868-876f-99eccaae27a4" containerName="ovn-config" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.931828 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="246898f0-6dfa-4868-876f-99eccaae27a4" containerName="ovn-config" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.933281 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.937432 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 13 18:30:28 crc kubenswrapper[4974]: I1013 18:30:28.942284 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv-config-zchl6"] Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.025303 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc6gg\" (UniqueName: \"kubernetes.io/projected/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-kube-api-access-cc6gg\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.025368 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-log-ovn\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.025393 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.025481 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run-ovn\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: 
\"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.025526 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-additional-scripts\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.025555 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-scripts\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.127072 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-additional-scripts\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.127152 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-scripts\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.127286 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc6gg\" (UniqueName: \"kubernetes.io/projected/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-kube-api-access-cc6gg\") pod 
\"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.127348 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-log-ovn\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.127384 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.127434 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run-ovn\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.127714 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run-ovn\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.127694 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-log-ovn\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: 
\"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.127756 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.127782 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-additional-scripts\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.130805 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-scripts\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.160139 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc6gg\" (UniqueName: \"kubernetes.io/projected/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-kube-api-access-cc6gg\") pod \"ovn-controller-r5pdv-config-zchl6\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.252792 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.727744 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv-config-zchl6"] Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.797608 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Oct 13 18:30:29 crc kubenswrapper[4974]: I1013 18:30:29.830004 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="246898f0-6dfa-4868-876f-99eccaae27a4" path="/var/lib/kubelet/pods/246898f0-6dfa-4868-876f-99eccaae27a4/volumes" Oct 13 18:30:30 crc kubenswrapper[4974]: I1013 18:30:30.135017 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="a3edaa1a-d213-473f-963a-3bfea41226ec" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Oct 13 18:30:30 crc kubenswrapper[4974]: I1013 18:30:30.431738 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="97f813f5-f34c-4b82-b066-032f8b795049" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.108:5671: connect: connection refused" Oct 13 18:30:30 crc kubenswrapper[4974]: I1013 18:30:30.448435 4974 generic.go:334] "Generic (PLEG): container finished" podID="f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" containerID="ca65f504f908623cde173873fab57be00d92d9ce2b70f357fc3564a84a2d51a8" exitCode=0 Oct 13 18:30:30 crc kubenswrapper[4974]: I1013 18:30:30.448488 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-zchl6" 
event={"ID":"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21","Type":"ContainerDied","Data":"ca65f504f908623cde173873fab57be00d92d9ce2b70f357fc3564a84a2d51a8"} Oct 13 18:30:30 crc kubenswrapper[4974]: I1013 18:30:30.448525 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-zchl6" event={"ID":"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21","Type":"ContainerStarted","Data":"7f0dcd79044012f702564708e478651caca7411b477cd7d2de51d827d8b51b60"} Oct 13 18:30:31 crc kubenswrapper[4974]: E1013 18:30:31.045203 4974 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f3cf41f_a929_4f3c_9063_682924915904.slice/crio-6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f3cf41f_a929_4f3c_9063_682924915904.slice/crio-conmon-6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd.scope\": RecentStats: unable to find data in memory cache]" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.460485 4974 generic.go:334] "Generic (PLEG): container finished" podID="7f3cf41f-a929-4f3c-9063-682924915904" containerID="6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd" exitCode=0 Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.460638 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7f3cf41f-a929-4f3c-9063-682924915904","Type":"ContainerDied","Data":"6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd"} Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.833478 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.977916 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-log-ovn\") pod \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.978146 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" (UID: "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.978430 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run\") pod \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.978495 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc6gg\" (UniqueName: \"kubernetes.io/projected/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-kube-api-access-cc6gg\") pod \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.978517 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run" (OuterVolumeSpecName: "var-run") pod "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" (UID: "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.978582 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run-ovn\") pod \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.978636 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-scripts\") pod \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.978694 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-additional-scripts\") pod \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\" (UID: \"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21\") " Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.978624 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" (UID: "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.979218 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" (UID: "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.979326 4974 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.979372 4974 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.979384 4974 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.979399 4974 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.979842 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-scripts" (OuterVolumeSpecName: "scripts") pod "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" (UID: "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:31 crc kubenswrapper[4974]: I1013 18:30:31.982192 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-kube-api-access-cc6gg" (OuterVolumeSpecName: "kube-api-access-cc6gg") pod "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" (UID: "f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21"). InnerVolumeSpecName "kube-api-access-cc6gg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.080578 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc6gg\" (UniqueName: \"kubernetes.io/projected/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-kube-api-access-cc6gg\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.080609 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.470838 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-zchl6" event={"ID":"f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21","Type":"ContainerDied","Data":"7f0dcd79044012f702564708e478651caca7411b477cd7d2de51d827d8b51b60"} Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.470885 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f0dcd79044012f702564708e478651caca7411b477cd7d2de51d827d8b51b60" Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.470892 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-zchl6" Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.473379 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7f3cf41f-a929-4f3c-9063-682924915904","Type":"ContainerStarted","Data":"8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6"} Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.917781 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r5pdv-config-zchl6"] Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.923823 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r5pdv-config-zchl6"] Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.964288 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r5pdv-config-2mnc7"] Oct 13 18:30:32 crc kubenswrapper[4974]: E1013 18:30:32.964769 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" containerName="ovn-config" Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.964792 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" containerName="ovn-config" Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.965006 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" containerName="ovn-config" Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.965922 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.967742 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 13 18:30:32 crc kubenswrapper[4974]: I1013 18:30:32.987945 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv-config-2mnc7"] Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.095787 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqpsl\" (UniqueName: \"kubernetes.io/projected/242b50f0-a032-49cc-b631-726a49672b27-kube-api-access-xqpsl\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.096035 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-additional-scripts\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.096227 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-scripts\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.096384 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run-ovn\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: 
\"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.096477 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.096562 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-log-ovn\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.198123 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-additional-scripts\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.198184 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-scripts\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.198227 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run-ovn\") pod \"ovn-controller-r5pdv-config-2mnc7\" 
(UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.198262 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.198300 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-log-ovn\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.198372 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqpsl\" (UniqueName: \"kubernetes.io/projected/242b50f0-a032-49cc-b631-726a49672b27-kube-api-access-xqpsl\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.199460 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-additional-scripts\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.200050 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: 
\"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.200070 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run-ovn\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.200127 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-log-ovn\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.201725 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-scripts\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.223868 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqpsl\" (UniqueName: \"kubernetes.io/projected/242b50f0-a032-49cc-b631-726a49672b27-kube-api-access-xqpsl\") pod \"ovn-controller-r5pdv-config-2mnc7\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.282859 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.561074 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv-config-2mnc7"] Oct 13 18:30:33 crc kubenswrapper[4974]: W1013 18:30:33.564889 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod242b50f0_a032_49cc_b631_726a49672b27.slice/crio-1a65313a07da44ead636efa472b63cd00974a88f822587dfdc58cc52067be9e5 WatchSource:0}: Error finding container 1a65313a07da44ead636efa472b63cd00974a88f822587dfdc58cc52067be9e5: Status 404 returned error can't find the container with id 1a65313a07da44ead636efa472b63cd00974a88f822587dfdc58cc52067be9e5 Oct 13 18:30:33 crc kubenswrapper[4974]: I1013 18:30:33.825315 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21" path="/var/lib/kubelet/pods/f6ef5dd4-4e81-4fb8-bac0-a97fa14d9a21/volumes" Oct 13 18:30:34 crc kubenswrapper[4974]: I1013 18:30:34.492127 4974 generic.go:334] "Generic (PLEG): container finished" podID="242b50f0-a032-49cc-b631-726a49672b27" containerID="5544ce163a08127e1c9ef75f7b429c575d7a7198e2bf33873fa4cb6fc3f3a050" exitCode=0 Oct 13 18:30:34 crc kubenswrapper[4974]: I1013 18:30:34.492202 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-2mnc7" event={"ID":"242b50f0-a032-49cc-b631-726a49672b27","Type":"ContainerDied","Data":"5544ce163a08127e1c9ef75f7b429c575d7a7198e2bf33873fa4cb6fc3f3a050"} Oct 13 18:30:34 crc kubenswrapper[4974]: I1013 18:30:34.492604 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-2mnc7" event={"ID":"242b50f0-a032-49cc-b631-726a49672b27","Type":"ContainerStarted","Data":"1a65313a07da44ead636efa472b63cd00974a88f822587dfdc58cc52067be9e5"} Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.072362 4974 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.194300 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65cff4f877-2mckm"] Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.194812 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" podUID="37628139-a9e5-4396-a102-b1cc1de2fd2e" containerName="dnsmasq-dns" containerID="cri-o://13f558a91965142fefcd5b7de36b3049dbe860968e2e34d0ef1395f653ac60dc" gracePeriod=10 Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.501078 4974 generic.go:334] "Generic (PLEG): container finished" podID="37628139-a9e5-4396-a102-b1cc1de2fd2e" containerID="13f558a91965142fefcd5b7de36b3049dbe860968e2e34d0ef1395f653ac60dc" exitCode=0 Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.501138 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" event={"ID":"37628139-a9e5-4396-a102-b1cc1de2fd2e","Type":"ContainerDied","Data":"13f558a91965142fefcd5b7de36b3049dbe860968e2e34d0ef1395f653ac60dc"} Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.503725 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7f3cf41f-a929-4f3c-9063-682924915904","Type":"ContainerStarted","Data":"0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303"} Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.503742 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7f3cf41f-a929-4f3c-9063-682924915904","Type":"ContainerStarted","Data":"30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0"} Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.529641 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/prometheus-metric-storage-0" podStartSLOduration=16.529624829 podStartE2EDuration="16.529624829s" podCreationTimestamp="2025-10-13 18:30:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:30:35.523455465 +0000 UTC m=+970.427821555" watchObservedRunningTime="2025-10-13 18:30:35.529624829 +0000 UTC m=+970.433990909" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.676387 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.844363 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-dns-svc\") pod \"37628139-a9e5-4396-a102-b1cc1de2fd2e\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.844670 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5spjq\" (UniqueName: \"kubernetes.io/projected/37628139-a9e5-4396-a102-b1cc1de2fd2e-kube-api-access-5spjq\") pod \"37628139-a9e5-4396-a102-b1cc1de2fd2e\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.844706 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-config\") pod \"37628139-a9e5-4396-a102-b1cc1de2fd2e\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.844757 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-nb\") pod \"37628139-a9e5-4396-a102-b1cc1de2fd2e\" (UID: 
\"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.844835 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-sb\") pod \"37628139-a9e5-4396-a102-b1cc1de2fd2e\" (UID: \"37628139-a9e5-4396-a102-b1cc1de2fd2e\") " Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.861208 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37628139-a9e5-4396-a102-b1cc1de2fd2e-kube-api-access-5spjq" (OuterVolumeSpecName: "kube-api-access-5spjq") pod "37628139-a9e5-4396-a102-b1cc1de2fd2e" (UID: "37628139-a9e5-4396-a102-b1cc1de2fd2e"). InnerVolumeSpecName "kube-api-access-5spjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.897558 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37628139-a9e5-4396-a102-b1cc1de2fd2e" (UID: "37628139-a9e5-4396-a102-b1cc1de2fd2e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.901029 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37628139-a9e5-4396-a102-b1cc1de2fd2e" (UID: "37628139-a9e5-4396-a102-b1cc1de2fd2e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.904782 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37628139-a9e5-4396-a102-b1cc1de2fd2e" (UID: "37628139-a9e5-4396-a102-b1cc1de2fd2e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.907872 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-config" (OuterVolumeSpecName: "config") pod "37628139-a9e5-4396-a102-b1cc1de2fd2e" (UID: "37628139-a9e5-4396-a102-b1cc1de2fd2e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.947151 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.947183 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5spjq\" (UniqueName: \"kubernetes.io/projected/37628139-a9e5-4396-a102-b1cc1de2fd2e-kube-api-access-5spjq\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.947255 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.947268 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:35 crc 
kubenswrapper[4974]: I1013 18:30:35.947276 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37628139-a9e5-4396-a102-b1cc1de2fd2e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:35 crc kubenswrapper[4974]: I1013 18:30:35.956542 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.047983 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run\") pod \"242b50f0-a032-49cc-b631-726a49672b27\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.048169 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqpsl\" (UniqueName: \"kubernetes.io/projected/242b50f0-a032-49cc-b631-726a49672b27-kube-api-access-xqpsl\") pod \"242b50f0-a032-49cc-b631-726a49672b27\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.048257 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run-ovn\") pod \"242b50f0-a032-49cc-b631-726a49672b27\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.048296 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-log-ovn\") pod \"242b50f0-a032-49cc-b631-726a49672b27\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.048335 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-scripts\") pod \"242b50f0-a032-49cc-b631-726a49672b27\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.048361 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-additional-scripts\") pod \"242b50f0-a032-49cc-b631-726a49672b27\" (UID: \"242b50f0-a032-49cc-b631-726a49672b27\") " Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.048563 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "242b50f0-a032-49cc-b631-726a49672b27" (UID: "242b50f0-a032-49cc-b631-726a49672b27"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.048595 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "242b50f0-a032-49cc-b631-726a49672b27" (UID: "242b50f0-a032-49cc-b631-726a49672b27"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.049395 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "242b50f0-a032-49cc-b631-726a49672b27" (UID: "242b50f0-a032-49cc-b631-726a49672b27"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.049922 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-scripts" (OuterVolumeSpecName: "scripts") pod "242b50f0-a032-49cc-b631-726a49672b27" (UID: "242b50f0-a032-49cc-b631-726a49672b27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.050001 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run" (OuterVolumeSpecName: "var-run") pod "242b50f0-a032-49cc-b631-726a49672b27" (UID: "242b50f0-a032-49cc-b631-726a49672b27"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.052415 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/242b50f0-a032-49cc-b631-726a49672b27-kube-api-access-xqpsl" (OuterVolumeSpecName: "kube-api-access-xqpsl") pod "242b50f0-a032-49cc-b631-726a49672b27" (UID: "242b50f0-a032-49cc-b631-726a49672b27"). InnerVolumeSpecName "kube-api-access-xqpsl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.150719 4974 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.150756 4974 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.150767 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.150776 4974 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/242b50f0-a032-49cc-b631-726a49672b27-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.150786 4974 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/242b50f0-a032-49cc-b631-726a49672b27-var-run\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.150794 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqpsl\" (UniqueName: \"kubernetes.io/projected/242b50f0-a032-49cc-b631-726a49672b27-kube-api-access-xqpsl\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.512138 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-2mnc7" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.513037 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-2mnc7" event={"ID":"242b50f0-a032-49cc-b631-726a49672b27","Type":"ContainerDied","Data":"1a65313a07da44ead636efa472b63cd00974a88f822587dfdc58cc52067be9e5"} Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.513093 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a65313a07da44ead636efa472b63cd00974a88f822587dfdc58cc52067be9e5" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.515783 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.515830 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-65cff4f877-2mckm" event={"ID":"37628139-a9e5-4396-a102-b1cc1de2fd2e","Type":"ContainerDied","Data":"f820462a93b0f6aae1c61e759d41d2b18815e8aba4a9940827524d75cc9e7182"} Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.515942 4974 scope.go:117] "RemoveContainer" containerID="13f558a91965142fefcd5b7de36b3049dbe860968e2e34d0ef1395f653ac60dc" Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.545524 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-65cff4f877-2mckm"] Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.552152 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-65cff4f877-2mckm"] Oct 13 18:30:36 crc kubenswrapper[4974]: I1013 18:30:36.552355 4974 scope.go:117] "RemoveContainer" containerID="cdbdf537d5f0b6dc689516cf8aa79a990c66b7a5f1ba685b8b169f31c269fcc3" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.043577 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r5pdv-config-2mnc7"] Oct 13 18:30:37 crc 
kubenswrapper[4974]: I1013 18:30:37.052538 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r5pdv-config-2mnc7"] Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.169960 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-r5pdv-config-dmjp5"] Oct 13 18:30:37 crc kubenswrapper[4974]: E1013 18:30:37.170523 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37628139-a9e5-4396-a102-b1cc1de2fd2e" containerName="dnsmasq-dns" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.170539 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="37628139-a9e5-4396-a102-b1cc1de2fd2e" containerName="dnsmasq-dns" Oct 13 18:30:37 crc kubenswrapper[4974]: E1013 18:30:37.170585 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="242b50f0-a032-49cc-b631-726a49672b27" containerName="ovn-config" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.170596 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="242b50f0-a032-49cc-b631-726a49672b27" containerName="ovn-config" Oct 13 18:30:37 crc kubenswrapper[4974]: E1013 18:30:37.170611 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37628139-a9e5-4396-a102-b1cc1de2fd2e" containerName="init" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.170619 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="37628139-a9e5-4396-a102-b1cc1de2fd2e" containerName="init" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.170838 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="242b50f0-a032-49cc-b631-726a49672b27" containerName="ovn-config" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.170890 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="37628139-a9e5-4396-a102-b1cc1de2fd2e" containerName="dnsmasq-dns" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.171537 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.176251 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.180062 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv-config-dmjp5"] Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.269933 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-scripts\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.270046 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.270100 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-additional-scripts\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.270224 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvlhc\" (UniqueName: \"kubernetes.io/projected/60cef625-362c-45d3-a894-e0ee8840da56-kube-api-access-zvlhc\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: 
\"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.270480 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-log-ovn\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.270544 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run-ovn\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.371791 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-scripts\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.371849 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.371881 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-additional-scripts\") pod \"ovn-controller-r5pdv-config-dmjp5\" 
(UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.371908 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvlhc\" (UniqueName: \"kubernetes.io/projected/60cef625-362c-45d3-a894-e0ee8840da56-kube-api-access-zvlhc\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.371953 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-log-ovn\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.371976 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run-ovn\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.372300 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run-ovn\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.372921 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: 
\"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.373000 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-log-ovn\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.374226 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-additional-scripts\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.374266 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-scripts\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.391425 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvlhc\" (UniqueName: \"kubernetes.io/projected/60cef625-362c-45d3-a894-e0ee8840da56-kube-api-access-zvlhc\") pod \"ovn-controller-r5pdv-config-dmjp5\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.493812 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.841618 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="242b50f0-a032-49cc-b631-726a49672b27" path="/var/lib/kubelet/pods/242b50f0-a032-49cc-b631-726a49672b27/volumes" Oct 13 18:30:37 crc kubenswrapper[4974]: I1013 18:30:37.842686 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37628139-a9e5-4396-a102-b1cc1de2fd2e" path="/var/lib/kubelet/pods/37628139-a9e5-4396-a102-b1cc1de2fd2e/volumes" Oct 13 18:30:38 crc kubenswrapper[4974]: I1013 18:30:38.012070 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-r5pdv-config-dmjp5"] Oct 13 18:30:38 crc kubenswrapper[4974]: I1013 18:30:38.537888 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-dmjp5" event={"ID":"60cef625-362c-45d3-a894-e0ee8840da56","Type":"ContainerStarted","Data":"e13e34ba0f4e9f55479f04a643e78140f9612282cf82aee579f4ccd1b24af1eb"} Oct 13 18:30:38 crc kubenswrapper[4974]: I1013 18:30:38.538171 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-dmjp5" event={"ID":"60cef625-362c-45d3-a894-e0ee8840da56","Type":"ContainerStarted","Data":"0184f145b15432f5cc33c5231531d9f60038ac0280c0043f59e75d9b66479d9e"} Oct 13 18:30:38 crc kubenswrapper[4974]: I1013 18:30:38.561640 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-r5pdv-config-dmjp5" podStartSLOduration=1.561623365 podStartE2EDuration="1.561623365s" podCreationTimestamp="2025-10-13 18:30:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:30:38.5557998 +0000 UTC m=+973.460165890" watchObservedRunningTime="2025-10-13 18:30:38.561623365 +0000 UTC m=+973.465989445" Oct 13 18:30:39 crc kubenswrapper[4974]: 
I1013 18:30:39.552292 4974 generic.go:334] "Generic (PLEG): container finished" podID="60cef625-362c-45d3-a894-e0ee8840da56" containerID="e13e34ba0f4e9f55479f04a643e78140f9612282cf82aee579f4ccd1b24af1eb" exitCode=0 Oct 13 18:30:39 crc kubenswrapper[4974]: I1013 18:30:39.552993 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-r5pdv-config-dmjp5" event={"ID":"60cef625-362c-45d3-a894-e0ee8840da56","Type":"ContainerDied","Data":"e13e34ba0f4e9f55479f04a643e78140f9612282cf82aee579f4ccd1b24af1eb"} Oct 13 18:30:39 crc kubenswrapper[4974]: I1013 18:30:39.676818 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:39 crc kubenswrapper[4974]: I1013 18:30:39.796705 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:30:40 crc kubenswrapper[4974]: I1013 18:30:40.132364 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Oct 13 18:30:40 crc kubenswrapper[4974]: I1013 18:30:40.430427 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 13 18:30:40 crc kubenswrapper[4974]: I1013 18:30:40.875420 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.034199 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run\") pod \"60cef625-362c-45d3-a894-e0ee8840da56\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.034290 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-additional-scripts\") pod \"60cef625-362c-45d3-a894-e0ee8840da56\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.034329 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run" (OuterVolumeSpecName: "var-run") pod "60cef625-362c-45d3-a894-e0ee8840da56" (UID: "60cef625-362c-45d3-a894-e0ee8840da56"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.034397 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run-ovn\") pod \"60cef625-362c-45d3-a894-e0ee8840da56\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.034505 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-log-ovn\") pod \"60cef625-362c-45d3-a894-e0ee8840da56\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.034492 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "60cef625-362c-45d3-a894-e0ee8840da56" (UID: "60cef625-362c-45d3-a894-e0ee8840da56"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.034559 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-scripts\") pod \"60cef625-362c-45d3-a894-e0ee8840da56\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.034581 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "60cef625-362c-45d3-a894-e0ee8840da56" (UID: "60cef625-362c-45d3-a894-e0ee8840da56"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.035304 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "60cef625-362c-45d3-a894-e0ee8840da56" (UID: "60cef625-362c-45d3-a894-e0ee8840da56"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.036062 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-scripts" (OuterVolumeSpecName: "scripts") pod "60cef625-362c-45d3-a894-e0ee8840da56" (UID: "60cef625-362c-45d3-a894-e0ee8840da56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.036080 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvlhc\" (UniqueName: \"kubernetes.io/projected/60cef625-362c-45d3-a894-e0ee8840da56-kube-api-access-zvlhc\") pod \"60cef625-362c-45d3-a894-e0ee8840da56\" (UID: \"60cef625-362c-45d3-a894-e0ee8840da56\") " Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.037084 4974 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-log-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.037120 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.037139 4974 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.037161 4974 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60cef625-362c-45d3-a894-e0ee8840da56-additional-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.037182 4974 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60cef625-362c-45d3-a894-e0ee8840da56-var-run-ovn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.042588 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60cef625-362c-45d3-a894-e0ee8840da56-kube-api-access-zvlhc" (OuterVolumeSpecName: "kube-api-access-zvlhc") pod "60cef625-362c-45d3-a894-e0ee8840da56" (UID: "60cef625-362c-45d3-a894-e0ee8840da56"). InnerVolumeSpecName "kube-api-access-zvlhc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.115086 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-r5pdv-config-dmjp5"] Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.123187 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-r5pdv-config-dmjp5"] Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.138236 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvlhc\" (UniqueName: \"kubernetes.io/projected/60cef625-362c-45d3-a894-e0ee8840da56-kube-api-access-zvlhc\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.219373 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-7mkgf"] Oct 13 18:30:41 crc kubenswrapper[4974]: E1013 18:30:41.219820 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60cef625-362c-45d3-a894-e0ee8840da56" containerName="ovn-config" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.219840 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="60cef625-362c-45d3-a894-e0ee8840da56" containerName="ovn-config" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.220079 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="60cef625-362c-45d3-a894-e0ee8840da56" containerName="ovn-config" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.220764 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7mkgf" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.241451 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7mkgf"] Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.351301 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw2zv\" (UniqueName: \"kubernetes.io/projected/7b802023-9db0-4145-8872-6fb2fb3f26e3-kube-api-access-pw2zv\") pod \"glance-db-create-7mkgf\" (UID: \"7b802023-9db0-4145-8872-6fb2fb3f26e3\") " pod="openstack/glance-db-create-7mkgf" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.453460 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw2zv\" (UniqueName: \"kubernetes.io/projected/7b802023-9db0-4145-8872-6fb2fb3f26e3-kube-api-access-pw2zv\") pod \"glance-db-create-7mkgf\" (UID: \"7b802023-9db0-4145-8872-6fb2fb3f26e3\") " pod="openstack/glance-db-create-7mkgf" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.471371 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw2zv\" (UniqueName: \"kubernetes.io/projected/7b802023-9db0-4145-8872-6fb2fb3f26e3-kube-api-access-pw2zv\") pod \"glance-db-create-7mkgf\" (UID: \"7b802023-9db0-4145-8872-6fb2fb3f26e3\") " pod="openstack/glance-db-create-7mkgf" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.546307 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7mkgf" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.571957 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0184f145b15432f5cc33c5231531d9f60038ac0280c0043f59e75d9b66479d9e" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.572026 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-r5pdv-config-dmjp5" Oct 13 18:30:41 crc kubenswrapper[4974]: I1013 18:30:41.825886 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60cef625-362c-45d3-a894-e0ee8840da56" path="/var/lib/kubelet/pods/60cef625-362c-45d3-a894-e0ee8840da56/volumes" Oct 13 18:30:42 crc kubenswrapper[4974]: I1013 18:30:42.026401 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-7mkgf"] Oct 13 18:30:42 crc kubenswrapper[4974]: I1013 18:30:42.588838 4974 generic.go:334] "Generic (PLEG): container finished" podID="7b802023-9db0-4145-8872-6fb2fb3f26e3" containerID="a23f544c94efd54b1575365a3a1b38c4f701ae96577e618eaf603f909d5a4a23" exitCode=0 Oct 13 18:30:42 crc kubenswrapper[4974]: I1013 18:30:42.589664 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7mkgf" event={"ID":"7b802023-9db0-4145-8872-6fb2fb3f26e3","Type":"ContainerDied","Data":"a23f544c94efd54b1575365a3a1b38c4f701ae96577e618eaf603f909d5a4a23"} Oct 13 18:30:42 crc kubenswrapper[4974]: I1013 18:30:42.589694 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7mkgf" event={"ID":"7b802023-9db0-4145-8872-6fb2fb3f26e3","Type":"ContainerStarted","Data":"3903548cb49abfc3324a8b6c1eaa65cf64a6ce493b4260fa01cad5a61d77a00f"} Oct 13 18:30:42 crc kubenswrapper[4974]: I1013 18:30:42.838966 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-r947d"] Oct 13 18:30:42 crc kubenswrapper[4974]: I1013 18:30:42.842347 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-r947d" Oct 13 18:30:42 crc kubenswrapper[4974]: I1013 18:30:42.850165 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-r947d"] Oct 13 18:30:42 crc kubenswrapper[4974]: I1013 18:30:42.877926 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4w78\" (UniqueName: \"kubernetes.io/projected/723dbb39-0ec4-4193-8fb9-307d311f962e-kube-api-access-d4w78\") pod \"barbican-db-create-r947d\" (UID: \"723dbb39-0ec4-4193-8fb9-307d311f962e\") " pod="openstack/barbican-db-create-r947d" Oct 13 18:30:42 crc kubenswrapper[4974]: I1013 18:30:42.979099 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4w78\" (UniqueName: \"kubernetes.io/projected/723dbb39-0ec4-4193-8fb9-307d311f962e-kube-api-access-d4w78\") pod \"barbican-db-create-r947d\" (UID: \"723dbb39-0ec4-4193-8fb9-307d311f962e\") " pod="openstack/barbican-db-create-r947d" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.008811 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4w78\" (UniqueName: \"kubernetes.io/projected/723dbb39-0ec4-4193-8fb9-307d311f962e-kube-api-access-d4w78\") pod \"barbican-db-create-r947d\" (UID: \"723dbb39-0ec4-4193-8fb9-307d311f962e\") " pod="openstack/barbican-db-create-r947d" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.027787 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-pwgg6"] Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.029077 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-pwgg6" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.076919 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pwgg6"] Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.138068 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-f52xm"] Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.139090 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.143926 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-dv7bj" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.146432 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-f52xm"] Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.157379 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r947d" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.162637 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.181798 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc98p\" (UniqueName: \"kubernetes.io/projected/839f3919-be07-4c53-9d6d-92e2ad6d0059-kube-api-access-cc98p\") pod \"cinder-db-create-pwgg6\" (UID: \"839f3919-be07-4c53-9d6d-92e2ad6d0059\") " pod="openstack/cinder-db-create-pwgg6" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.197822 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zqz9v"] Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.204521 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zqz9v" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.212981 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zqz9v"] Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.283287 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc98p\" (UniqueName: \"kubernetes.io/projected/839f3919-be07-4c53-9d6d-92e2ad6d0059-kube-api-access-cc98p\") pod \"cinder-db-create-pwgg6\" (UID: \"839f3919-be07-4c53-9d6d-92e2ad6d0059\") " pod="openstack/cinder-db-create-pwgg6" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.283395 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc4lz\" (UniqueName: \"kubernetes.io/projected/84eeba03-fc43-4027-8528-dc161a147dfb-kube-api-access-nc4lz\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.283423 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-combined-ca-bundle\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.283450 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-config-data\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.283481 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-db-sync-config-data\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.303885 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc98p\" (UniqueName: \"kubernetes.io/projected/839f3919-be07-4c53-9d6d-92e2ad6d0059-kube-api-access-cc98p\") pod \"cinder-db-create-pwgg6\" (UID: \"839f3919-be07-4c53-9d6d-92e2ad6d0059\") " pod="openstack/cinder-db-create-pwgg6" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.350035 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pwgg6" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.385702 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc4lz\" (UniqueName: \"kubernetes.io/projected/84eeba03-fc43-4027-8528-dc161a147dfb-kube-api-access-nc4lz\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.385761 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-combined-ca-bundle\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.385792 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-config-data\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.385821 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-db-sync-config-data\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.385862 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zblvq\" (UniqueName: \"kubernetes.io/projected/fff4b736-2214-4791-b0c5-6d909c396d53-kube-api-access-zblvq\") pod \"neutron-db-create-zqz9v\" (UID: \"fff4b736-2214-4791-b0c5-6d909c396d53\") " pod="openstack/neutron-db-create-zqz9v" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.398133 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-db-sync-config-data\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.398673 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-combined-ca-bundle\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.398778 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-config-data\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.421425 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc4lz\" 
(UniqueName: \"kubernetes.io/projected/84eeba03-fc43-4027-8528-dc161a147dfb-kube-api-access-nc4lz\") pod \"watcher-db-sync-f52xm\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.437734 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-p8g7f"] Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.442520 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.447869 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.448120 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.450782 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.450857 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-788q9" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.453269 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-f52xm" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.455402 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p8g7f"] Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.488239 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-combined-ca-bundle\") pod \"keystone-db-sync-p8g7f\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.488328 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv2m7\" (UniqueName: \"kubernetes.io/projected/4f53b24c-9498-4e81-815b-9817e4be03be-kube-api-access-sv2m7\") pod \"keystone-db-sync-p8g7f\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.488455 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zblvq\" (UniqueName: \"kubernetes.io/projected/fff4b736-2214-4791-b0c5-6d909c396d53-kube-api-access-zblvq\") pod \"neutron-db-create-zqz9v\" (UID: \"fff4b736-2214-4791-b0c5-6d909c396d53\") " pod="openstack/neutron-db-create-zqz9v" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.488525 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-config-data\") pod \"keystone-db-sync-p8g7f\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.508549 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zblvq\" (UniqueName: 
\"kubernetes.io/projected/fff4b736-2214-4791-b0c5-6d909c396d53-kube-api-access-zblvq\") pod \"neutron-db-create-zqz9v\" (UID: \"fff4b736-2214-4791-b0c5-6d909c396d53\") " pod="openstack/neutron-db-create-zqz9v" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.607700 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zqz9v" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.608346 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-combined-ca-bundle\") pod \"keystone-db-sync-p8g7f\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.608396 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv2m7\" (UniqueName: \"kubernetes.io/projected/4f53b24c-9498-4e81-815b-9817e4be03be-kube-api-access-sv2m7\") pod \"keystone-db-sync-p8g7f\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.608486 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-config-data\") pod \"keystone-db-sync-p8g7f\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.614937 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-config-data\") pod \"keystone-db-sync-p8g7f\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.619024 4974 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-combined-ca-bundle\") pod \"keystone-db-sync-p8g7f\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.631695 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv2m7\" (UniqueName: \"kubernetes.io/projected/4f53b24c-9498-4e81-815b-9817e4be03be-kube-api-access-sv2m7\") pod \"keystone-db-sync-p8g7f\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.687712 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-r947d"] Oct 13 18:30:43 crc kubenswrapper[4974]: W1013 18:30:43.693976 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod723dbb39_0ec4_4193_8fb9_307d311f962e.slice/crio-808c798c58ffdfb5de7739a2cd4a5d1bd557a0edc484c1f4a4de62f2b6c2019e WatchSource:0}: Error finding container 808c798c58ffdfb5de7739a2cd4a5d1bd557a0edc484c1f4a4de62f2b6c2019e: Status 404 returned error can't find the container with id 808c798c58ffdfb5de7739a2cd4a5d1bd557a0edc484c1f4a4de62f2b6c2019e Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.773407 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:30:43 crc kubenswrapper[4974]: W1013 18:30:43.847238 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84eeba03_fc43_4027_8528_dc161a147dfb.slice/crio-71fe90032b39edd3b0686eeeb7781b6f52b8f9a04b91441abe7d861a0f010b9b WatchSource:0}: Error finding container 71fe90032b39edd3b0686eeeb7781b6f52b8f9a04b91441abe7d861a0f010b9b: Status 404 returned error can't find the container with id 71fe90032b39edd3b0686eeeb7781b6f52b8f9a04b91441abe7d861a0f010b9b Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.850306 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-f52xm"] Oct 13 18:30:43 crc kubenswrapper[4974]: I1013 18:30:43.891933 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-pwgg6"] Oct 13 18:30:43 crc kubenswrapper[4974]: W1013 18:30:43.905319 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod839f3919_be07_4c53_9d6d_92e2ad6d0059.slice/crio-85741d3a19d257b2fcdb01560f93f0461c9e64a075da2d4aa5a6c4d7a8f18fb3 WatchSource:0}: Error finding container 85741d3a19d257b2fcdb01560f93f0461c9e64a075da2d4aa5a6c4d7a8f18fb3: Status 404 returned error can't find the container with id 85741d3a19d257b2fcdb01560f93f0461c9e64a075da2d4aa5a6c4d7a8f18fb3 Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.172098 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-7mkgf" Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.177299 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zqz9v"] Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.326915 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw2zv\" (UniqueName: \"kubernetes.io/projected/7b802023-9db0-4145-8872-6fb2fb3f26e3-kube-api-access-pw2zv\") pod \"7b802023-9db0-4145-8872-6fb2fb3f26e3\" (UID: \"7b802023-9db0-4145-8872-6fb2fb3f26e3\") " Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.329400 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-p8g7f"] Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.332530 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b802023-9db0-4145-8872-6fb2fb3f26e3-kube-api-access-pw2zv" (OuterVolumeSpecName: "kube-api-access-pw2zv") pod "7b802023-9db0-4145-8872-6fb2fb3f26e3" (UID: "7b802023-9db0-4145-8872-6fb2fb3f26e3"). InnerVolumeSpecName "kube-api-access-pw2zv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:44 crc kubenswrapper[4974]: W1013 18:30:44.340977 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f53b24c_9498_4e81_815b_9817e4be03be.slice/crio-bfa8638bac8dad536a55f97976e0992f84a2dd83d1a2a1c0729a2385c80ea9b5 WatchSource:0}: Error finding container bfa8638bac8dad536a55f97976e0992f84a2dd83d1a2a1c0729a2385c80ea9b5: Status 404 returned error can't find the container with id bfa8638bac8dad536a55f97976e0992f84a2dd83d1a2a1c0729a2385c80ea9b5 Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.429010 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw2zv\" (UniqueName: \"kubernetes.io/projected/7b802023-9db0-4145-8872-6fb2fb3f26e3-kube-api-access-pw2zv\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.645929 4974 generic.go:334] "Generic (PLEG): container finished" podID="fff4b736-2214-4791-b0c5-6d909c396d53" containerID="4bbff5f7998054525385d30f74d9873da184ca2359e0ec25cabd8776691a8783" exitCode=0 Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.646042 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zqz9v" event={"ID":"fff4b736-2214-4791-b0c5-6d909c396d53","Type":"ContainerDied","Data":"4bbff5f7998054525385d30f74d9873da184ca2359e0ec25cabd8776691a8783"} Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.646076 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zqz9v" event={"ID":"fff4b736-2214-4791-b0c5-6d909c396d53","Type":"ContainerStarted","Data":"3e5ba3a60c0026bcf0af1256b7cedfc1906024a3933abbe0355eaf93ed53da66"} Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.657194 4974 generic.go:334] "Generic (PLEG): container finished" podID="839f3919-be07-4c53-9d6d-92e2ad6d0059" containerID="cf7079c43af91ccf2039a49d5c815174b0bc871585f8c595fe21f4e15f59748c" 
exitCode=0 Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.657271 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pwgg6" event={"ID":"839f3919-be07-4c53-9d6d-92e2ad6d0059","Type":"ContainerDied","Data":"cf7079c43af91ccf2039a49d5c815174b0bc871585f8c595fe21f4e15f59748c"} Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.657302 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pwgg6" event={"ID":"839f3919-be07-4c53-9d6d-92e2ad6d0059","Type":"ContainerStarted","Data":"85741d3a19d257b2fcdb01560f93f0461c9e64a075da2d4aa5a6c4d7a8f18fb3"} Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.682165 4974 generic.go:334] "Generic (PLEG): container finished" podID="723dbb39-0ec4-4193-8fb9-307d311f962e" containerID="8842edcc3e9698c59e0930af911f9443192bd37bd3d08e35af227c57ff1977e0" exitCode=0 Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.682268 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r947d" event={"ID":"723dbb39-0ec4-4193-8fb9-307d311f962e","Type":"ContainerDied","Data":"8842edcc3e9698c59e0930af911f9443192bd37bd3d08e35af227c57ff1977e0"} Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.682299 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r947d" event={"ID":"723dbb39-0ec4-4193-8fb9-307d311f962e","Type":"ContainerStarted","Data":"808c798c58ffdfb5de7739a2cd4a5d1bd557a0edc484c1f4a4de62f2b6c2019e"} Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.684734 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p8g7f" event={"ID":"4f53b24c-9498-4e81-815b-9817e4be03be","Type":"ContainerStarted","Data":"bfa8638bac8dad536a55f97976e0992f84a2dd83d1a2a1c0729a2385c80ea9b5"} Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.704893 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f52xm" 
event={"ID":"84eeba03-fc43-4027-8528-dc161a147dfb","Type":"ContainerStarted","Data":"71fe90032b39edd3b0686eeeb7781b6f52b8f9a04b91441abe7d861a0f010b9b"} Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.720666 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-7mkgf" event={"ID":"7b802023-9db0-4145-8872-6fb2fb3f26e3","Type":"ContainerDied","Data":"3903548cb49abfc3324a8b6c1eaa65cf64a6ce493b4260fa01cad5a61d77a00f"} Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.720703 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3903548cb49abfc3324a8b6c1eaa65cf64a6ce493b4260fa01cad5a61d77a00f" Oct 13 18:30:44 crc kubenswrapper[4974]: I1013 18:30:44.720755 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-7mkgf" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.198200 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pwgg6" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.270000 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cc98p\" (UniqueName: \"kubernetes.io/projected/839f3919-be07-4c53-9d6d-92e2ad6d0059-kube-api-access-cc98p\") pod \"839f3919-be07-4c53-9d6d-92e2ad6d0059\" (UID: \"839f3919-be07-4c53-9d6d-92e2ad6d0059\") " Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.280644 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/839f3919-be07-4c53-9d6d-92e2ad6d0059-kube-api-access-cc98p" (OuterVolumeSpecName: "kube-api-access-cc98p") pod "839f3919-be07-4c53-9d6d-92e2ad6d0059" (UID: "839f3919-be07-4c53-9d6d-92e2ad6d0059"). InnerVolumeSpecName "kube-api-access-cc98p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.352937 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zqz9v" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.356513 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r947d" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.372011 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zblvq\" (UniqueName: \"kubernetes.io/projected/fff4b736-2214-4791-b0c5-6d909c396d53-kube-api-access-zblvq\") pod \"fff4b736-2214-4791-b0c5-6d909c396d53\" (UID: \"fff4b736-2214-4791-b0c5-6d909c396d53\") " Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.372108 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4w78\" (UniqueName: \"kubernetes.io/projected/723dbb39-0ec4-4193-8fb9-307d311f962e-kube-api-access-d4w78\") pod \"723dbb39-0ec4-4193-8fb9-307d311f962e\" (UID: \"723dbb39-0ec4-4193-8fb9-307d311f962e\") " Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.372513 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cc98p\" (UniqueName: \"kubernetes.io/projected/839f3919-be07-4c53-9d6d-92e2ad6d0059-kube-api-access-cc98p\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.381237 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff4b736-2214-4791-b0c5-6d909c396d53-kube-api-access-zblvq" (OuterVolumeSpecName: "kube-api-access-zblvq") pod "fff4b736-2214-4791-b0c5-6d909c396d53" (UID: "fff4b736-2214-4791-b0c5-6d909c396d53"). InnerVolumeSpecName "kube-api-access-zblvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.381297 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723dbb39-0ec4-4193-8fb9-307d311f962e-kube-api-access-d4w78" (OuterVolumeSpecName: "kube-api-access-d4w78") pod "723dbb39-0ec4-4193-8fb9-307d311f962e" (UID: "723dbb39-0ec4-4193-8fb9-307d311f962e"). InnerVolumeSpecName "kube-api-access-d4w78". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.474824 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zblvq\" (UniqueName: \"kubernetes.io/projected/fff4b736-2214-4791-b0c5-6d909c396d53-kube-api-access-zblvq\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.474863 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4w78\" (UniqueName: \"kubernetes.io/projected/723dbb39-0ec4-4193-8fb9-307d311f962e-kube-api-access-d4w78\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.739842 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zqz9v" event={"ID":"fff4b736-2214-4791-b0c5-6d909c396d53","Type":"ContainerDied","Data":"3e5ba3a60c0026bcf0af1256b7cedfc1906024a3933abbe0355eaf93ed53da66"} Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.739883 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e5ba3a60c0026bcf0af1256b7cedfc1906024a3933abbe0355eaf93ed53da66" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.739955 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zqz9v" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.743323 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-pwgg6" event={"ID":"839f3919-be07-4c53-9d6d-92e2ad6d0059","Type":"ContainerDied","Data":"85741d3a19d257b2fcdb01560f93f0461c9e64a075da2d4aa5a6c4d7a8f18fb3"} Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.743373 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85741d3a19d257b2fcdb01560f93f0461c9e64a075da2d4aa5a6c4d7a8f18fb3" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.743436 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-pwgg6" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.746318 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-r947d" Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.746306 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-r947d" event={"ID":"723dbb39-0ec4-4193-8fb9-307d311f962e","Type":"ContainerDied","Data":"808c798c58ffdfb5de7739a2cd4a5d1bd557a0edc484c1f4a4de62f2b6c2019e"} Oct 13 18:30:46 crc kubenswrapper[4974]: I1013 18:30:46.746455 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="808c798c58ffdfb5de7739a2cd4a5d1bd557a0edc484c1f4a4de62f2b6c2019e" Oct 13 18:30:49 crc kubenswrapper[4974]: I1013 18:30:49.677097 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:49 crc kubenswrapper[4974]: I1013 18:30:49.683992 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:49 crc kubenswrapper[4974]: I1013 18:30:49.783193 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/prometheus-metric-storage-0" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.347126 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-d9de-account-create-lw89h"] Oct 13 18:30:51 crc kubenswrapper[4974]: E1013 18:30:51.347876 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="839f3919-be07-4c53-9d6d-92e2ad6d0059" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.347893 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="839f3919-be07-4c53-9d6d-92e2ad6d0059" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: E1013 18:30:51.347937 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff4b736-2214-4791-b0c5-6d909c396d53" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.347946 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff4b736-2214-4791-b0c5-6d909c396d53" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: E1013 18:30:51.347968 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723dbb39-0ec4-4193-8fb9-307d311f962e" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.347977 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="723dbb39-0ec4-4193-8fb9-307d311f962e" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: E1013 18:30:51.348010 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b802023-9db0-4145-8872-6fb2fb3f26e3" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.348020 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b802023-9db0-4145-8872-6fb2fb3f26e3" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.348244 4974 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7b802023-9db0-4145-8872-6fb2fb3f26e3" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.348279 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="839f3919-be07-4c53-9d6d-92e2ad6d0059" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.348293 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff4b736-2214-4791-b0c5-6d909c396d53" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.348312 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="723dbb39-0ec4-4193-8fb9-307d311f962e" containerName="mariadb-database-create" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.349086 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d9de-account-create-lw89h" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.351562 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.366478 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d9de-account-create-lw89h"] Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.458565 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtkqv\" (UniqueName: \"kubernetes.io/projected/a93751bb-94e4-430a-8e5c-2c5dd63bb013-kube-api-access-xtkqv\") pod \"glance-d9de-account-create-lw89h\" (UID: \"a93751bb-94e4-430a-8e5c-2c5dd63bb013\") " pod="openstack/glance-d9de-account-create-lw89h" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.559674 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtkqv\" (UniqueName: \"kubernetes.io/projected/a93751bb-94e4-430a-8e5c-2c5dd63bb013-kube-api-access-xtkqv\") pod \"glance-d9de-account-create-lw89h\" (UID: 
\"a93751bb-94e4-430a-8e5c-2c5dd63bb013\") " pod="openstack/glance-d9de-account-create-lw89h" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.583821 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtkqv\" (UniqueName: \"kubernetes.io/projected/a93751bb-94e4-430a-8e5c-2c5dd63bb013-kube-api-access-xtkqv\") pod \"glance-d9de-account-create-lw89h\" (UID: \"a93751bb-94e4-430a-8e5c-2c5dd63bb013\") " pod="openstack/glance-d9de-account-create-lw89h" Oct 13 18:30:51 crc kubenswrapper[4974]: I1013 18:30:51.672472 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d9de-account-create-lw89h" Oct 13 18:30:52 crc kubenswrapper[4974]: I1013 18:30:52.968028 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-ccf2-account-create-5h2vg"] Oct 13 18:30:52 crc kubenswrapper[4974]: I1013 18:30:52.969262 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-ccf2-account-create-5h2vg" Oct 13 18:30:52 crc kubenswrapper[4974]: I1013 18:30:52.970969 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Oct 13 18:30:52 crc kubenswrapper[4974]: I1013 18:30:52.975499 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ccf2-account-create-5h2vg"] Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.001669 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjjbg\" (UniqueName: \"kubernetes.io/projected/0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b-kube-api-access-vjjbg\") pod \"cinder-ccf2-account-create-5h2vg\" (UID: \"0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b\") " pod="openstack/cinder-ccf2-account-create-5h2vg" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.103715 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjjbg\" (UniqueName: 
\"kubernetes.io/projected/0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b-kube-api-access-vjjbg\") pod \"cinder-ccf2-account-create-5h2vg\" (UID: \"0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b\") " pod="openstack/cinder-ccf2-account-create-5h2vg" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.120107 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjjbg\" (UniqueName: \"kubernetes.io/projected/0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b-kube-api-access-vjjbg\") pod \"cinder-ccf2-account-create-5h2vg\" (UID: \"0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b\") " pod="openstack/cinder-ccf2-account-create-5h2vg" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.167546 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-27f6-account-create-42tjc"] Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.169248 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-27f6-account-create-42tjc" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.174781 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.183077 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-27f6-account-create-42tjc"] Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.309469 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzrqn\" (UniqueName: \"kubernetes.io/projected/4f298626-ae2a-4e3d-8c73-f1813c3aa22d-kube-api-access-jzrqn\") pod \"barbican-27f6-account-create-42tjc\" (UID: \"4f298626-ae2a-4e3d-8c73-f1813c3aa22d\") " pod="openstack/barbican-27f6-account-create-42tjc" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.329101 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ccf2-account-create-5h2vg" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.361789 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-cb6f-account-create-2qvkf"] Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.363877 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cb6f-account-create-2qvkf" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.368099 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.369998 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cb6f-account-create-2qvkf"] Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.411784 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzrqn\" (UniqueName: \"kubernetes.io/projected/4f298626-ae2a-4e3d-8c73-f1813c3aa22d-kube-api-access-jzrqn\") pod \"barbican-27f6-account-create-42tjc\" (UID: \"4f298626-ae2a-4e3d-8c73-f1813c3aa22d\") " pod="openstack/barbican-27f6-account-create-42tjc" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.430603 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzrqn\" (UniqueName: \"kubernetes.io/projected/4f298626-ae2a-4e3d-8c73-f1813c3aa22d-kube-api-access-jzrqn\") pod \"barbican-27f6-account-create-42tjc\" (UID: \"4f298626-ae2a-4e3d-8c73-f1813c3aa22d\") " pod="openstack/barbican-27f6-account-create-42tjc" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.513362 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsht8\" (UniqueName: \"kubernetes.io/projected/1c510728-c859-42b5-ac31-a8c1c9b10b75-kube-api-access-qsht8\") pod \"neutron-cb6f-account-create-2qvkf\" (UID: \"1c510728-c859-42b5-ac31-a8c1c9b10b75\") " 
pod="openstack/neutron-cb6f-account-create-2qvkf" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.521569 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-27f6-account-create-42tjc" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.614714 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsht8\" (UniqueName: \"kubernetes.io/projected/1c510728-c859-42b5-ac31-a8c1c9b10b75-kube-api-access-qsht8\") pod \"neutron-cb6f-account-create-2qvkf\" (UID: \"1c510728-c859-42b5-ac31-a8c1c9b10b75\") " pod="openstack/neutron-cb6f-account-create-2qvkf" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.634855 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsht8\" (UniqueName: \"kubernetes.io/projected/1c510728-c859-42b5-ac31-a8c1c9b10b75-kube-api-access-qsht8\") pod \"neutron-cb6f-account-create-2qvkf\" (UID: \"1c510728-c859-42b5-ac31-a8c1c9b10b75\") " pod="openstack/neutron-cb6f-account-create-2qvkf" Oct 13 18:30:53 crc kubenswrapper[4974]: I1013 18:30:53.695373 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cb6f-account-create-2qvkf" Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.737974 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-27f6-account-create-42tjc"] Oct 13 18:30:55 crc kubenswrapper[4974]: W1013 18:30:55.740262 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f298626_ae2a_4e3d_8c73_f1813c3aa22d.slice/crio-d0ae91c0e3f931e6893dc74789f0459c82c6ba411e76d57e874d7e60854d22c2 WatchSource:0}: Error finding container d0ae91c0e3f931e6893dc74789f0459c82c6ba411e76d57e874d7e60854d22c2: Status 404 returned error can't find the container with id d0ae91c0e3f931e6893dc74789f0459c82c6ba411e76d57e874d7e60854d22c2 Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.804671 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-cb6f-account-create-2qvkf"] Oct 13 18:30:55 crc kubenswrapper[4974]: W1013 18:30:55.817105 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f6785f8_7dd2_4a77_8e6c_437cb7b2dd6b.slice/crio-1828fa2ae23bc96f0bb18dcff734f08f783d1b95fe0c125ea7b9b809eb338a58 WatchSource:0}: Error finding container 1828fa2ae23bc96f0bb18dcff734f08f783d1b95fe0c125ea7b9b809eb338a58: Status 404 returned error can't find the container with id 1828fa2ae23bc96f0bb18dcff734f08f783d1b95fe0c125ea7b9b809eb338a58 Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.832257 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-ccf2-account-create-5h2vg"] Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.839532 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-d9de-account-create-lw89h"] Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.842784 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f52xm" 
event={"ID":"84eeba03-fc43-4027-8528-dc161a147dfb","Type":"ContainerStarted","Data":"0db80815fbfe5f9e722c5f0be136e8718e2486ad22aa7a3e5c67111ffb7947a2"} Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.846375 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ccf2-account-create-5h2vg" event={"ID":"0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b","Type":"ContainerStarted","Data":"1828fa2ae23bc96f0bb18dcff734f08f783d1b95fe0c125ea7b9b809eb338a58"} Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.854498 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-27f6-account-create-42tjc" event={"ID":"4f298626-ae2a-4e3d-8c73-f1813c3aa22d","Type":"ContainerStarted","Data":"d0ae91c0e3f931e6893dc74789f0459c82c6ba411e76d57e874d7e60854d22c2"} Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.856323 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cb6f-account-create-2qvkf" event={"ID":"1c510728-c859-42b5-ac31-a8c1c9b10b75","Type":"ContainerStarted","Data":"6d25b835a32e65f7d33fe1400621a7b78525bcab9607fbdf0d48b355220ecfaf"} Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.857758 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p8g7f" event={"ID":"4f53b24c-9498-4e81-815b-9817e4be03be","Type":"ContainerStarted","Data":"4291043869c587289116d62157b7fcc81418609dbb592877d2acd2295ba448a8"} Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.858756 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d9de-account-create-lw89h" event={"ID":"a93751bb-94e4-430a-8e5c-2c5dd63bb013","Type":"ContainerStarted","Data":"7b221745c911e6fd6407cc72bb969ce3af12a53926366e07870b5769a61f918d"} Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.908241 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-f52xm" podStartSLOduration=1.4774285090000001 podStartE2EDuration="12.908223223s" 
podCreationTimestamp="2025-10-13 18:30:43 +0000 UTC" firstStartedPulling="2025-10-13 18:30:43.850378757 +0000 UTC m=+978.754744837" lastFinishedPulling="2025-10-13 18:30:55.281173471 +0000 UTC m=+990.185539551" observedRunningTime="2025-10-13 18:30:55.897302275 +0000 UTC m=+990.801668355" watchObservedRunningTime="2025-10-13 18:30:55.908223223 +0000 UTC m=+990.812589303" Oct 13 18:30:55 crc kubenswrapper[4974]: I1013 18:30:55.925989 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-p8g7f" podStartSLOduration=2.023119966 podStartE2EDuration="12.925958873s" podCreationTimestamp="2025-10-13 18:30:43 +0000 UTC" firstStartedPulling="2025-10-13 18:30:44.34307963 +0000 UTC m=+979.247445710" lastFinishedPulling="2025-10-13 18:30:55.245918537 +0000 UTC m=+990.150284617" observedRunningTime="2025-10-13 18:30:55.915026865 +0000 UTC m=+990.819392955" watchObservedRunningTime="2025-10-13 18:30:55.925958873 +0000 UTC m=+990.830324993" Oct 13 18:30:56 crc kubenswrapper[4974]: I1013 18:30:56.870575 4974 generic.go:334] "Generic (PLEG): container finished" podID="1c510728-c859-42b5-ac31-a8c1c9b10b75" containerID="d304d035e40cf3ccdd1c00da88f92fa05765850c29b4a27741add4ec95d53776" exitCode=0 Oct 13 18:30:56 crc kubenswrapper[4974]: I1013 18:30:56.870768 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cb6f-account-create-2qvkf" event={"ID":"1c510728-c859-42b5-ac31-a8c1c9b10b75","Type":"ContainerDied","Data":"d304d035e40cf3ccdd1c00da88f92fa05765850c29b4a27741add4ec95d53776"} Oct 13 18:30:56 crc kubenswrapper[4974]: I1013 18:30:56.874203 4974 generic.go:334] "Generic (PLEG): container finished" podID="a93751bb-94e4-430a-8e5c-2c5dd63bb013" containerID="aed0e2a969ca344755ad2a72349ef12ddb8d5b64a78579e6a544fd756c049ac3" exitCode=0 Oct 13 18:30:56 crc kubenswrapper[4974]: I1013 18:30:56.874274 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d9de-account-create-lw89h" 
event={"ID":"a93751bb-94e4-430a-8e5c-2c5dd63bb013","Type":"ContainerDied","Data":"aed0e2a969ca344755ad2a72349ef12ddb8d5b64a78579e6a544fd756c049ac3"} Oct 13 18:30:56 crc kubenswrapper[4974]: I1013 18:30:56.876543 4974 generic.go:334] "Generic (PLEG): container finished" podID="0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b" containerID="d7ebd1a0be85eda5dbe7c4a76310869b51136d3da0f3bf5b6c47c53a33d23230" exitCode=0 Oct 13 18:30:56 crc kubenswrapper[4974]: I1013 18:30:56.876691 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ccf2-account-create-5h2vg" event={"ID":"0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b","Type":"ContainerDied","Data":"d7ebd1a0be85eda5dbe7c4a76310869b51136d3da0f3bf5b6c47c53a33d23230"} Oct 13 18:30:56 crc kubenswrapper[4974]: I1013 18:30:56.883813 4974 generic.go:334] "Generic (PLEG): container finished" podID="4f298626-ae2a-4e3d-8c73-f1813c3aa22d" containerID="4a1dee7d8a1e0fd0eafc37f984fd73038436cd0e3927c9bb3b7f5977ae6bc713" exitCode=0 Oct 13 18:30:56 crc kubenswrapper[4974]: I1013 18:30:56.883913 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-27f6-account-create-42tjc" event={"ID":"4f298626-ae2a-4e3d-8c73-f1813c3aa22d","Type":"ContainerDied","Data":"4a1dee7d8a1e0fd0eafc37f984fd73038436cd0e3927c9bb3b7f5977ae6bc713"} Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.245698 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ccf2-account-create-5h2vg" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.415826 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjjbg\" (UniqueName: \"kubernetes.io/projected/0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b-kube-api-access-vjjbg\") pod \"0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b\" (UID: \"0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b\") " Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.422642 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b-kube-api-access-vjjbg" (OuterVolumeSpecName: "kube-api-access-vjjbg") pod "0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b" (UID: "0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b"). InnerVolumeSpecName "kube-api-access-vjjbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.424421 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-27f6-account-create-42tjc" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.472448 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-cb6f-account-create-2qvkf" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.476948 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-d9de-account-create-lw89h" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.518047 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjjbg\" (UniqueName: \"kubernetes.io/projected/0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b-kube-api-access-vjjbg\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.620002 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtkqv\" (UniqueName: \"kubernetes.io/projected/a93751bb-94e4-430a-8e5c-2c5dd63bb013-kube-api-access-xtkqv\") pod \"a93751bb-94e4-430a-8e5c-2c5dd63bb013\" (UID: \"a93751bb-94e4-430a-8e5c-2c5dd63bb013\") " Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.620185 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsht8\" (UniqueName: \"kubernetes.io/projected/1c510728-c859-42b5-ac31-a8c1c9b10b75-kube-api-access-qsht8\") pod \"1c510728-c859-42b5-ac31-a8c1c9b10b75\" (UID: \"1c510728-c859-42b5-ac31-a8c1c9b10b75\") " Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.620311 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzrqn\" (UniqueName: \"kubernetes.io/projected/4f298626-ae2a-4e3d-8c73-f1813c3aa22d-kube-api-access-jzrqn\") pod \"4f298626-ae2a-4e3d-8c73-f1813c3aa22d\" (UID: \"4f298626-ae2a-4e3d-8c73-f1813c3aa22d\") " Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.623450 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a93751bb-94e4-430a-8e5c-2c5dd63bb013-kube-api-access-xtkqv" (OuterVolumeSpecName: "kube-api-access-xtkqv") pod "a93751bb-94e4-430a-8e5c-2c5dd63bb013" (UID: "a93751bb-94e4-430a-8e5c-2c5dd63bb013"). InnerVolumeSpecName "kube-api-access-xtkqv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.626927 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c510728-c859-42b5-ac31-a8c1c9b10b75-kube-api-access-qsht8" (OuterVolumeSpecName: "kube-api-access-qsht8") pod "1c510728-c859-42b5-ac31-a8c1c9b10b75" (UID: "1c510728-c859-42b5-ac31-a8c1c9b10b75"). InnerVolumeSpecName "kube-api-access-qsht8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.629710 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f298626-ae2a-4e3d-8c73-f1813c3aa22d-kube-api-access-jzrqn" (OuterVolumeSpecName: "kube-api-access-jzrqn") pod "4f298626-ae2a-4e3d-8c73-f1813c3aa22d" (UID: "4f298626-ae2a-4e3d-8c73-f1813c3aa22d"). InnerVolumeSpecName "kube-api-access-jzrqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.722344 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzrqn\" (UniqueName: \"kubernetes.io/projected/4f298626-ae2a-4e3d-8c73-f1813c3aa22d-kube-api-access-jzrqn\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.722373 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtkqv\" (UniqueName: \"kubernetes.io/projected/a93751bb-94e4-430a-8e5c-2c5dd63bb013-kube-api-access-xtkqv\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.722384 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsht8\" (UniqueName: \"kubernetes.io/projected/1c510728-c859-42b5-ac31-a8c1c9b10b75-kube-api-access-qsht8\") on node \"crc\" DevicePath \"\"" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.906066 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-d9de-account-create-lw89h" 
event={"ID":"a93751bb-94e4-430a-8e5c-2c5dd63bb013","Type":"ContainerDied","Data":"7b221745c911e6fd6407cc72bb969ce3af12a53926366e07870b5769a61f918d"} Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.906126 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b221745c911e6fd6407cc72bb969ce3af12a53926366e07870b5769a61f918d" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.906090 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-d9de-account-create-lw89h" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.909350 4974 generic.go:334] "Generic (PLEG): container finished" podID="84eeba03-fc43-4027-8528-dc161a147dfb" containerID="0db80815fbfe5f9e722c5f0be136e8718e2486ad22aa7a3e5c67111ffb7947a2" exitCode=0 Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.909395 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f52xm" event={"ID":"84eeba03-fc43-4027-8528-dc161a147dfb","Type":"ContainerDied","Data":"0db80815fbfe5f9e722c5f0be136e8718e2486ad22aa7a3e5c67111ffb7947a2"} Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.912676 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-ccf2-account-create-5h2vg" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.912728 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-ccf2-account-create-5h2vg" event={"ID":"0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b","Type":"ContainerDied","Data":"1828fa2ae23bc96f0bb18dcff734f08f783d1b95fe0c125ea7b9b809eb338a58"} Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.912770 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1828fa2ae23bc96f0bb18dcff734f08f783d1b95fe0c125ea7b9b809eb338a58" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.915932 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-27f6-account-create-42tjc" event={"ID":"4f298626-ae2a-4e3d-8c73-f1813c3aa22d","Type":"ContainerDied","Data":"d0ae91c0e3f931e6893dc74789f0459c82c6ba411e76d57e874d7e60854d22c2"} Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.915961 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0ae91c0e3f931e6893dc74789f0459c82c6ba411e76d57e874d7e60854d22c2" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.916012 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-27f6-account-create-42tjc" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.920104 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-cb6f-account-create-2qvkf" event={"ID":"1c510728-c859-42b5-ac31-a8c1c9b10b75","Type":"ContainerDied","Data":"6d25b835a32e65f7d33fe1400621a7b78525bcab9607fbdf0d48b355220ecfaf"} Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.920189 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-cb6f-account-create-2qvkf" Oct 13 18:30:58 crc kubenswrapper[4974]: I1013 18:30:58.920197 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d25b835a32e65f7d33fe1400621a7b78525bcab9607fbdf0d48b355220ecfaf" Oct 13 18:30:59 crc kubenswrapper[4974]: I1013 18:30:59.931562 4974 generic.go:334] "Generic (PLEG): container finished" podID="4f53b24c-9498-4e81-815b-9817e4be03be" containerID="4291043869c587289116d62157b7fcc81418609dbb592877d2acd2295ba448a8" exitCode=0 Oct 13 18:30:59 crc kubenswrapper[4974]: I1013 18:30:59.931703 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p8g7f" event={"ID":"4f53b24c-9498-4e81-815b-9817e4be03be","Type":"ContainerDied","Data":"4291043869c587289116d62157b7fcc81418609dbb592877d2acd2295ba448a8"} Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.373955 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-f52xm" Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.558580 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-db-sync-config-data\") pod \"84eeba03-fc43-4027-8528-dc161a147dfb\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.558757 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc4lz\" (UniqueName: \"kubernetes.io/projected/84eeba03-fc43-4027-8528-dc161a147dfb-kube-api-access-nc4lz\") pod \"84eeba03-fc43-4027-8528-dc161a147dfb\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.558843 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-combined-ca-bundle\") pod \"84eeba03-fc43-4027-8528-dc161a147dfb\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.558936 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-config-data\") pod \"84eeba03-fc43-4027-8528-dc161a147dfb\" (UID: \"84eeba03-fc43-4027-8528-dc161a147dfb\") " Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.569147 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "84eeba03-fc43-4027-8528-dc161a147dfb" (UID: "84eeba03-fc43-4027-8528-dc161a147dfb"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.570148 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84eeba03-fc43-4027-8528-dc161a147dfb-kube-api-access-nc4lz" (OuterVolumeSpecName: "kube-api-access-nc4lz") pod "84eeba03-fc43-4027-8528-dc161a147dfb" (UID: "84eeba03-fc43-4027-8528-dc161a147dfb"). InnerVolumeSpecName "kube-api-access-nc4lz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.615110 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84eeba03-fc43-4027-8528-dc161a147dfb" (UID: "84eeba03-fc43-4027-8528-dc161a147dfb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.653195 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-config-data" (OuterVolumeSpecName: "config-data") pod "84eeba03-fc43-4027-8528-dc161a147dfb" (UID: "84eeba03-fc43-4027-8528-dc161a147dfb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.667571 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.667606 4974 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.667670 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc4lz\" (UniqueName: \"kubernetes.io/projected/84eeba03-fc43-4027-8528-dc161a147dfb-kube-api-access-nc4lz\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.667683 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84eeba03-fc43-4027-8528-dc161a147dfb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.957972 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-f52xm" event={"ID":"84eeba03-fc43-4027-8528-dc161a147dfb","Type":"ContainerDied","Data":"71fe90032b39edd3b0686eeeb7781b6f52b8f9a04b91441abe7d861a0f010b9b"} Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.958048 4974 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="71fe90032b39edd3b0686eeeb7781b6f52b8f9a04b91441abe7d861a0f010b9b" Oct 13 18:31:00 crc kubenswrapper[4974]: I1013 18:31:00.958144 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-f52xm" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.359276 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.390309 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-config-data\") pod \"4f53b24c-9498-4e81-815b-9817e4be03be\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.390495 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-combined-ca-bundle\") pod \"4f53b24c-9498-4e81-815b-9817e4be03be\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.390522 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv2m7\" (UniqueName: \"kubernetes.io/projected/4f53b24c-9498-4e81-815b-9817e4be03be-kube-api-access-sv2m7\") pod \"4f53b24c-9498-4e81-815b-9817e4be03be\" (UID: \"4f53b24c-9498-4e81-815b-9817e4be03be\") " Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.407858 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f53b24c-9498-4e81-815b-9817e4be03be-kube-api-access-sv2m7" (OuterVolumeSpecName: "kube-api-access-sv2m7") pod "4f53b24c-9498-4e81-815b-9817e4be03be" (UID: "4f53b24c-9498-4e81-815b-9817e4be03be"). InnerVolumeSpecName "kube-api-access-sv2m7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.415326 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4f53b24c-9498-4e81-815b-9817e4be03be" (UID: "4f53b24c-9498-4e81-815b-9817e4be03be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.440845 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-config-data" (OuterVolumeSpecName: "config-data") pod "4f53b24c-9498-4e81-815b-9817e4be03be" (UID: "4f53b24c-9498-4e81-815b-9817e4be03be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.493246 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.493557 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f53b24c-9498-4e81-815b-9817e4be03be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.493572 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv2m7\" (UniqueName: \"kubernetes.io/projected/4f53b24c-9498-4e81-815b-9817e4be03be-kube-api-access-sv2m7\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.596207 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-jgthj"] Oct 13 18:31:01 crc kubenswrapper[4974]: E1013 18:31:01.596729 4974 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4f53b24c-9498-4e81-815b-9817e4be03be" containerName="keystone-db-sync" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.596755 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f53b24c-9498-4e81-815b-9817e4be03be" containerName="keystone-db-sync" Oct 13 18:31:01 crc kubenswrapper[4974]: E1013 18:31:01.596792 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c510728-c859-42b5-ac31-a8c1c9b10b75" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.596803 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c510728-c859-42b5-ac31-a8c1c9b10b75" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: E1013 18:31:01.596823 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a93751bb-94e4-430a-8e5c-2c5dd63bb013" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.596838 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="a93751bb-94e4-430a-8e5c-2c5dd63bb013" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: E1013 18:31:01.596861 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84eeba03-fc43-4027-8528-dc161a147dfb" containerName="watcher-db-sync" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.596871 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="84eeba03-fc43-4027-8528-dc161a147dfb" containerName="watcher-db-sync" Oct 13 18:31:01 crc kubenswrapper[4974]: E1013 18:31:01.596903 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.596913 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: E1013 
18:31:01.596926 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f298626-ae2a-4e3d-8c73-f1813c3aa22d" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.596938 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f298626-ae2a-4e3d-8c73-f1813c3aa22d" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.597207 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.597277 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="a93751bb-94e4-430a-8e5c-2c5dd63bb013" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.597316 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f53b24c-9498-4e81-815b-9817e4be03be" containerName="keystone-db-sync" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.597343 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="84eeba03-fc43-4027-8528-dc161a147dfb" containerName="watcher-db-sync" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.597368 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c510728-c859-42b5-ac31-a8c1c9b10b75" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.597386 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f298626-ae2a-4e3d-8c73-f1813c3aa22d" containerName="mariadb-account-create" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.598619 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.602717 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.605949 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nt8rf" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.612290 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jgthj"] Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.696911 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-db-sync-config-data\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.697090 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-combined-ca-bundle\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.697200 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-config-data\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.697464 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtwx7\" (UniqueName: 
\"kubernetes.io/projected/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-kube-api-access-gtwx7\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.799797 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtwx7\" (UniqueName: \"kubernetes.io/projected/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-kube-api-access-gtwx7\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.799876 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-db-sync-config-data\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.799976 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-combined-ca-bundle\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.800007 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-config-data\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.804293 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-config-data\") pod \"glance-db-sync-jgthj\" (UID: 
\"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.808006 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-db-sync-config-data\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.823238 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-combined-ca-bundle\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.833751 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtwx7\" (UniqueName: \"kubernetes.io/projected/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-kube-api-access-gtwx7\") pod \"glance-db-sync-jgthj\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.919075 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jgthj" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.981277 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-p8g7f" event={"ID":"4f53b24c-9498-4e81-815b-9817e4be03be","Type":"ContainerDied","Data":"bfa8638bac8dad536a55f97976e0992f84a2dd83d1a2a1c0729a2385c80ea9b5"} Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.981356 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfa8638bac8dad536a55f97976e0992f84a2dd83d1a2a1c0729a2385c80ea9b5" Oct 13 18:31:01 crc kubenswrapper[4974]: I1013 18:31:01.981559 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-p8g7f" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.242298 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-755dd945cf-28mpc"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.243734 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.263711 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-755dd945cf-28mpc"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.276384 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-t64xt"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.278667 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.282755 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.283074 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.283200 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.283298 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-788q9" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307601 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4cvn\" (UniqueName: \"kubernetes.io/projected/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-kube-api-access-t4cvn\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307699 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-config-data\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307756 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-nb\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307788 
4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-config\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307804 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-combined-ca-bundle\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307821 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-credential-keys\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307836 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxdpl\" (UniqueName: \"kubernetes.io/projected/97890884-a647-48e7-8e81-be963e2b290b-kube-api-access-dxdpl\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307868 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-fernet-keys\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 
18:31:02.307887 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-swift-storage-0\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307904 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-sb\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307919 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-scripts\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.307953 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-svc\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.319606 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t64xt"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.383130 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.384330 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.395177 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.400003 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-dv7bj" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.410568 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4cvn\" (UniqueName: \"kubernetes.io/projected/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-kube-api-access-t4cvn\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.410629 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-config-data\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.410664 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-config-data\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.410710 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-nb\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 
18:31:02.410738 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/751ed806-3991-4b72-91fb-ff5d56c66849-logs\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.410760 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-config\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.410777 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-combined-ca-bundle\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.410797 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-credential-keys\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.413919 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxdpl\" (UniqueName: \"kubernetes.io/projected/97890884-a647-48e7-8e81-be963e2b290b-kube-api-access-dxdpl\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.413972 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pps8k\" (UniqueName: \"kubernetes.io/projected/751ed806-3991-4b72-91fb-ff5d56c66849-kube-api-access-pps8k\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.414025 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-fernet-keys\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.414047 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-swift-storage-0\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.414066 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-sb\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.414085 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-scripts\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.414135 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-svc\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.414167 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.414752 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.414760 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-nb\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.414935 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-config\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.424016 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-credential-keys\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.424552 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-swift-storage-0\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.428089 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-scripts\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.429174 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-sb\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.429588 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-svc\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.431256 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-combined-ca-bundle\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.439840 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-config-data\") pod 
\"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.452970 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.454339 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.455810 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4cvn\" (UniqueName: \"kubernetes.io/projected/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-kube-api-access-t4cvn\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.464595 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-fernet-keys\") pod \"keystone-bootstrap-t64xt\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.483849 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.493841 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.517884 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-config-data\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.517929 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.517979 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.518027 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-config-data\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.518053 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/751ed806-3991-4b72-91fb-ff5d56c66849-logs\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.518085 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pps8k\" (UniqueName: \"kubernetes.io/projected/751ed806-3991-4b72-91fb-ff5d56c66849-kube-api-access-pps8k\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.518103 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-logs\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.518158 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.518195 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxvct\" (UniqueName: \"kubernetes.io/projected/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-kube-api-access-zxvct\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.519924 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/751ed806-3991-4b72-91fb-ff5d56c66849-logs\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.520826 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.521732 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-config-data\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.522256 4974 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.522256 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxdpl\" (UniqueName: \"kubernetes.io/projected/97890884-a647-48e7-8e81-be963e2b290b-kube-api-access-dxdpl\") pod \"dnsmasq-dns-755dd945cf-28mpc\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.532594 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.570987 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.571524 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.584551 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-784fdb6789-wc4j6"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.586041 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.601289 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pps8k\" (UniqueName: \"kubernetes.io/projected/751ed806-3991-4b72-91fb-ff5d56c66849-kube-api-access-pps8k\") pod \"watcher-applier-0\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.609269 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.609837 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.609970 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.609971 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-ctpps" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.620902 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624266 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs7nw\" (UniqueName: \"kubernetes.io/projected/e2289a84-2fbc-42b5-884a-22f9134e8e15-kube-api-access-xs7nw\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624315 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2289a84-2fbc-42b5-884a-22f9134e8e15-logs\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " 
pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624356 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13cd3ab5-2164-479f-bff9-8cac3b968f77-logs\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624382 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-scripts\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624439 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvct\" (UniqueName: \"kubernetes.io/projected/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-kube-api-access-zxvct\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624477 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624518 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc 
kubenswrapper[4974]: I1013 18:31:02.624546 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-config-data\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624564 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e2289a84-2fbc-42b5-884a-22f9134e8e15-horizon-secret-key\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624590 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-config-data\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624614 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-config-data\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.624634 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-logs\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.625136 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.625170 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8snqs\" (UniqueName: \"kubernetes.io/projected/13cd3ab5-2164-479f-bff9-8cac3b968f77-kube-api-access-8snqs\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.625187 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.625221 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-784fdb6789-wc4j6"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.632746 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-logs\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.636594 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc 
kubenswrapper[4974]: I1013 18:31:02.637085 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.654384 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.656460 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-config-data\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.678221 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxvct\" (UniqueName: \"kubernetes.io/projected/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-kube-api-access-zxvct\") pod \"watcher-decision-engine-0\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.721305 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.727625 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-config-data\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.727680 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e2289a84-2fbc-42b5-884a-22f9134e8e15-horizon-secret-key\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.727718 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-config-data\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.727751 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.727779 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8snqs\" (UniqueName: \"kubernetes.io/projected/13cd3ab5-2164-479f-bff9-8cac3b968f77-kube-api-access-8snqs\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.727794 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.727817 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xs7nw\" (UniqueName: \"kubernetes.io/projected/e2289a84-2fbc-42b5-884a-22f9134e8e15-kube-api-access-xs7nw\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.727836 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2289a84-2fbc-42b5-884a-22f9134e8e15-logs\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.727859 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13cd3ab5-2164-479f-bff9-8cac3b968f77-logs\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.727877 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-scripts\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.729111 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2289a84-2fbc-42b5-884a-22f9134e8e15-logs\") pod 
\"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.753234 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e2289a84-2fbc-42b5-884a-22f9134e8e15-horizon-secret-key\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.760872 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-config-data\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.766639 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-scripts\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.780637 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs7nw\" (UniqueName: \"kubernetes.io/projected/e2289a84-2fbc-42b5-884a-22f9134e8e15-kube-api-access-xs7nw\") pod \"horizon-784fdb6789-wc4j6\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.806374 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8snqs\" (UniqueName: \"kubernetes.io/projected/13cd3ab5-2164-479f-bff9-8cac3b968f77-kube-api-access-8snqs\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc 
kubenswrapper[4974]: I1013 18:31:02.806756 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.809485 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-config-data\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.818549 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.823919 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13cd3ab5-2164-479f-bff9-8cac3b968f77-logs\") pod \"watcher-api-0\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") " pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.839849 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-484wv"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.840973 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-484wv" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.851447 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-484wv"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.860375 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.860539 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nrvcn" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.860643 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.881599 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-755dd945cf-28mpc"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.891292 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.968453 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.978716 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5fcccd5645-2z74m"] Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.981119 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:02 crc kubenswrapper[4974]: I1013 18:31:02.984589 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.009943 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.019745 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.022577 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.023921 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.035081 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fcccd5645-2z74m"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.039919 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-config-data\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.039962 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-combined-ca-bundle\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.039994 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gvmq\" (UniqueName: \"kubernetes.io/projected/49fb758b-0291-4449-aa49-7e191cc1b2dc-kube-api-access-6gvmq\") pod \"placement-db-sync-484wv\" (UID: 
\"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.040048 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-scripts\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.040080 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49fb758b-0291-4449-aa49-7e191cc1b2dc-logs\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.044993 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58d9fd7b6c-5j6jb"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.046541 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.088384 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.099082 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58d9fd7b6c-5j6jb"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.141770 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-scripts\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.141813 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-log-httpd\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.141855 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49fb758b-0291-4449-aa49-7e191cc1b2dc-logs\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.141885 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a480348-b0db-489e-be33-a93c1c6d311f-logs\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.141901 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.141949 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-config-data\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.141974 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-combined-ca-bundle\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.142009 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-scripts\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.142037 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gvmq\" (UniqueName: \"kubernetes.io/projected/49fb758b-0291-4449-aa49-7e191cc1b2dc-kube-api-access-6gvmq\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.142061 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-scripts\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.142079 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-config-data\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.142107 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppvzc\" (UniqueName: \"kubernetes.io/projected/1a480348-b0db-489e-be33-a93c1c6d311f-kube-api-access-ppvzc\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.142127 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxrvw\" (UniqueName: \"kubernetes.io/projected/300c099d-eca4-4f0c-a79f-dde4dddd8a98-kube-api-access-sxrvw\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.142145 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-run-httpd\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.142161 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-config-data\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.142177 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.142198 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a480348-b0db-489e-be33-a93c1c6d311f-horizon-secret-key\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.147126 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49fb758b-0291-4449-aa49-7e191cc1b2dc-logs\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.155963 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-scripts\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.161681 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-combined-ca-bundle\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") 
" pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.162913 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-config-data\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.188229 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gvmq\" (UniqueName: \"kubernetes.io/projected/49fb758b-0291-4449-aa49-7e191cc1b2dc-kube-api-access-6gvmq\") pod \"placement-db-sync-484wv\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.198038 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-jgthj"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.220786 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-njgsh"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.222129 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.226407 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.226572 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.226838 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9s2zv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.238586 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-484wv" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245293 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-swift-storage-0\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245521 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-scripts\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245545 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-config-data\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245574 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-svc\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245595 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppvzc\" (UniqueName: \"kubernetes.io/projected/1a480348-b0db-489e-be33-a93c1c6d311f-kube-api-access-ppvzc\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " 
pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245615 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxrvw\" (UniqueName: \"kubernetes.io/projected/300c099d-eca4-4f0c-a79f-dde4dddd8a98-kube-api-access-sxrvw\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245632 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-run-httpd\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245665 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-config-data\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245679 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245700 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a480348-b0db-489e-be33-a93c1c6d311f-horizon-secret-key\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245717 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-nb\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245744 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-log-httpd\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.245770 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7rbf\" (UniqueName: \"kubernetes.io/projected/7ec9f25a-64d8-4904-b1df-03678b32a639-kube-api-access-z7rbf\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.247330 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a480348-b0db-489e-be33-a93c1c6d311f-logs\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.247360 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.247445 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-sb\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.247483 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-config\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.247507 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-scripts\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.249146 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-njgsh"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.249170 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-scripts\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.250139 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-config-data\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.250321 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-log-httpd\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.250545 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-run-httpd\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.255800 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a480348-b0db-489e-be33-a93c1c6d311f-horizon-secret-key\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.264347 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.264858 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.265809 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-config-data\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 
18:31:03.269354 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a480348-b0db-489e-be33-a93c1c6d311f-logs\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.252844 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-scripts\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.279899 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppvzc\" (UniqueName: \"kubernetes.io/projected/1a480348-b0db-489e-be33-a93c1c6d311f-kube-api-access-ppvzc\") pod \"horizon-5fcccd5645-2z74m\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.314063 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxrvw\" (UniqueName: \"kubernetes.io/projected/300c099d-eca4-4f0c-a79f-dde4dddd8a98-kube-api-access-sxrvw\") pod \"ceilometer-0\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.330505 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349424 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7rbf\" (UniqueName: \"kubernetes.io/projected/7ec9f25a-64d8-4904-b1df-03678b32a639-kube-api-access-z7rbf\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349486 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-config-data\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349533 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbb85\" (UniqueName: \"kubernetes.io/projected/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-kube-api-access-zbb85\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349586 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-etc-machine-id\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349609 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-sb\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " 
pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349668 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-scripts\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349685 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-combined-ca-bundle\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349723 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-config\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349773 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-swift-storage-0\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349809 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-svc\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc 
kubenswrapper[4974]: I1013 18:31:03.349841 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-db-sync-config-data\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.349864 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-nb\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.350770 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-nb\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.352408 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-config\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.352439 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-sb\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.353034 4974 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-svc\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.353525 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.353674 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-swift-storage-0\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.353895 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-755dd945cf-28mpc"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.372720 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7rbf\" (UniqueName: \"kubernetes.io/projected/7ec9f25a-64d8-4904-b1df-03678b32a639-kube-api-access-z7rbf\") pod \"dnsmasq-dns-58d9fd7b6c-5j6jb\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.378671 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:03 crc kubenswrapper[4974]: W1013 18:31:03.417317 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97890884_a647_48e7_8e81_be963e2b290b.slice/crio-902fd3704ce67965ebc5718eab5a09fd6b2326b9cd28ca9b50aff6f8e18dcdb8 WatchSource:0}: Error finding container 902fd3704ce67965ebc5718eab5a09fd6b2326b9cd28ca9b50aff6f8e18dcdb8: Status 404 returned error can't find the container with id 902fd3704ce67965ebc5718eab5a09fd6b2326b9cd28ca9b50aff6f8e18dcdb8 Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.453181 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-config-data\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.453261 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbb85\" (UniqueName: \"kubernetes.io/projected/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-kube-api-access-zbb85\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.453322 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-etc-machine-id\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.453345 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-scripts\") pod 
\"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.453363 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-combined-ca-bundle\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.453518 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-db-sync-config-data\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.453999 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-etc-machine-id\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.457224 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-db-sync-config-data\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.460242 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-scripts\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: 
I1013 18:31:03.460285 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-combined-ca-bundle\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.461721 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-config-data\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.484245 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbb85\" (UniqueName: \"kubernetes.io/projected/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-kube-api-access-zbb85\") pod \"cinder-db-sync-njgsh\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.574777 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-njgsh" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.676329 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-l2v67"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.677725 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.708217 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-77n5s" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.708464 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.721432 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-l2v67"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.754084 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-w9hqc"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.755564 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.757945 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hsz8q" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.758378 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.758623 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.776996 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w9hqc"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.830399 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.865413 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bht4g\" (UniqueName: \"kubernetes.io/projected/f29d4def-3da3-43a3-8331-f1ee4644dad2-kube-api-access-bht4g\") pod 
\"barbican-db-sync-l2v67\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.865462 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-combined-ca-bundle\") pod \"neutron-db-sync-w9hqc\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.865526 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-config\") pod \"neutron-db-sync-w9hqc\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.865557 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-combined-ca-bundle\") pod \"barbican-db-sync-l2v67\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.865601 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmmvx\" (UniqueName: \"kubernetes.io/projected/2f038500-ef77-401b-9e63-755ea9a56695-kube-api-access-fmmvx\") pod \"neutron-db-sync-w9hqc\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.865635 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-db-sync-config-data\") pod 
\"barbican-db-sync-l2v67\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.895385 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-t64xt"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.966752 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-config\") pod \"neutron-db-sync-w9hqc\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.966823 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-combined-ca-bundle\") pod \"barbican-db-sync-l2v67\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.966981 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmmvx\" (UniqueName: \"kubernetes.io/projected/2f038500-ef77-401b-9e63-755ea9a56695-kube-api-access-fmmvx\") pod \"neutron-db-sync-w9hqc\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.967034 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-db-sync-config-data\") pod \"barbican-db-sync-l2v67\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.967107 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bht4g\" (UniqueName: 
\"kubernetes.io/projected/f29d4def-3da3-43a3-8331-f1ee4644dad2-kube-api-access-bht4g\") pod \"barbican-db-sync-l2v67\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.967139 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-combined-ca-bundle\") pod \"neutron-db-sync-w9hqc\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.971272 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-db-sync-config-data\") pod \"barbican-db-sync-l2v67\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.975248 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-combined-ca-bundle\") pod \"neutron-db-sync-w9hqc\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.977494 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-config\") pod \"neutron-db-sync-w9hqc\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.978682 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-combined-ca-bundle\") pod \"barbican-db-sync-l2v67\" (UID: 
\"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.979565 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.986356 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bht4g\" (UniqueName: \"kubernetes.io/projected/f29d4def-3da3-43a3-8331-f1ee4644dad2-kube-api-access-bht4g\") pod \"barbican-db-sync-l2v67\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:03 crc kubenswrapper[4974]: I1013 18:31:03.990332 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmmvx\" (UniqueName: \"kubernetes.io/projected/2f038500-ef77-401b-9e63-755ea9a56695-kube-api-access-fmmvx\") pod \"neutron-db-sync-w9hqc\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.039140 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.042770 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t64xt" event={"ID":"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f","Type":"ContainerStarted","Data":"347de3a34256615fbdc9e2dba681fab193b19bb27ea0279a4a7553cd66e73e83"} Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.049222 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"13cd3ab5-2164-479f-bff9-8cac3b968f77","Type":"ContainerStarted","Data":"e8afe6effa0077347206cd48588be68b38feb70f365ae154c47d5ad1cd83f1c3"} Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.058399 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" 
event={"ID":"751ed806-3991-4b72-91fb-ff5d56c66849","Type":"ContainerStarted","Data":"d5353e925e50561d2ebc6a87d768334b873495b6fdd00da63a7fe267cc51d326"} Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.068553 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.069439 4974 generic.go:334] "Generic (PLEG): container finished" podID="97890884-a647-48e7-8e81-be963e2b290b" containerID="e2d45ec80db4d6f8cc278511141e2275a5c60059ee6484cca187886ab0ebd0a1" exitCode=0 Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.069558 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-755dd945cf-28mpc" event={"ID":"97890884-a647-48e7-8e81-be963e2b290b","Type":"ContainerDied","Data":"e2d45ec80db4d6f8cc278511141e2275a5c60059ee6484cca187886ab0ebd0a1"} Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.069589 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-755dd945cf-28mpc" event={"ID":"97890884-a647-48e7-8e81-be963e2b290b","Type":"ContainerStarted","Data":"902fd3704ce67965ebc5718eab5a09fd6b2326b9cd28ca9b50aff6f8e18dcdb8"} Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.074567 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jgthj" event={"ID":"ab455ee5-f8bd-4d4e-b179-524fa3edcc52","Type":"ContainerStarted","Data":"87be49189833f50e15e2f4ef25769f6d16a9f57a9dca906d77c8edb3cfa15611"} Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.094838 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.157368 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58d9fd7b6c-5j6jb"] Oct 13 18:31:04 crc kubenswrapper[4974]: W1013 18:31:04.195364 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2289a84_2fbc_42b5_884a_22f9134e8e15.slice/crio-099dc291413330a09b1a30f43e1c80451f937b4117b405ca45d1b037b652df60 WatchSource:0}: Error finding container 099dc291413330a09b1a30f43e1c80451f937b4117b405ca45d1b037b652df60: Status 404 returned error can't find the container with id 099dc291413330a09b1a30f43e1c80451f937b4117b405ca45d1b037b652df60 Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.196243 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-784fdb6789-wc4j6"] Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.211987 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-484wv"] Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.341320 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:31:04 crc kubenswrapper[4974]: W1013 18:31:04.357703 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod300c099d_eca4_4f0c_a79f_dde4dddd8a98.slice/crio-e7ae00adc6008de72d279c24b5e188e9839442bb1950329bd992e04f65d16807 WatchSource:0}: Error finding container e7ae00adc6008de72d279c24b5e188e9839442bb1950329bd992e04f65d16807: Status 404 returned error can't find the container with id e7ae00adc6008de72d279c24b5e188e9839442bb1950329bd992e04f65d16807 Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.533589 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-njgsh"] Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.555955 4974 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5fcccd5645-2z74m"] Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.595763 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.683683 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxdpl\" (UniqueName: \"kubernetes.io/projected/97890884-a647-48e7-8e81-be963e2b290b-kube-api-access-dxdpl\") pod \"97890884-a647-48e7-8e81-be963e2b290b\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.684094 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-nb\") pod \"97890884-a647-48e7-8e81-be963e2b290b\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.684139 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-config\") pod \"97890884-a647-48e7-8e81-be963e2b290b\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.684193 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-svc\") pod \"97890884-a647-48e7-8e81-be963e2b290b\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.684223 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-sb\") pod \"97890884-a647-48e7-8e81-be963e2b290b\" (UID: 
\"97890884-a647-48e7-8e81-be963e2b290b\") " Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.684256 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-swift-storage-0\") pod \"97890884-a647-48e7-8e81-be963e2b290b\" (UID: \"97890884-a647-48e7-8e81-be963e2b290b\") " Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.723452 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-l2v67"] Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.744893 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97890884-a647-48e7-8e81-be963e2b290b-kube-api-access-dxdpl" (OuterVolumeSpecName: "kube-api-access-dxdpl") pod "97890884-a647-48e7-8e81-be963e2b290b" (UID: "97890884-a647-48e7-8e81-be963e2b290b"). InnerVolumeSpecName "kube-api-access-dxdpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:04 crc kubenswrapper[4974]: W1013 18:31:04.772962 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf29d4def_3da3_43a3_8331_f1ee4644dad2.slice/crio-a8ddb95503c6e81d03b2773af57e9348661c1e726c20422690017805867446fe WatchSource:0}: Error finding container a8ddb95503c6e81d03b2773af57e9348661c1e726c20422690017805867446fe: Status 404 returned error can't find the container with id a8ddb95503c6e81d03b2773af57e9348661c1e726c20422690017805867446fe Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.788593 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxdpl\" (UniqueName: \"kubernetes.io/projected/97890884-a647-48e7-8e81-be963e2b290b-kube-api-access-dxdpl\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.913125 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "97890884-a647-48e7-8e81-be963e2b290b" (UID: "97890884-a647-48e7-8e81-be963e2b290b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.929406 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "97890884-a647-48e7-8e81-be963e2b290b" (UID: "97890884-a647-48e7-8e81-be963e2b290b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.942184 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-config" (OuterVolumeSpecName: "config") pod "97890884-a647-48e7-8e81-be963e2b290b" (UID: "97890884-a647-48e7-8e81-be963e2b290b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.944134 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "97890884-a647-48e7-8e81-be963e2b290b" (UID: "97890884-a647-48e7-8e81-be963e2b290b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.947253 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "97890884-a647-48e7-8e81-be963e2b290b" (UID: "97890884-a647-48e7-8e81-be963e2b290b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.977314 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-w9hqc"] Oct 13 18:31:04 crc kubenswrapper[4974]: W1013 18:31:04.984645 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f038500_ef77_401b_9e63_755ea9a56695.slice/crio-797a3a7a27493b94efa9329cd8ee28369ac5bab26df3bbe2a4741b605c44d619 WatchSource:0}: Error finding container 797a3a7a27493b94efa9329cd8ee28369ac5bab26df3bbe2a4741b605c44d619: Status 404 returned error can't find the container with id 797a3a7a27493b94efa9329cd8ee28369ac5bab26df3bbe2a4741b605c44d619 Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.993313 4974 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.993370 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.993381 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.993390 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:04 crc kubenswrapper[4974]: I1013 18:31:04.993398 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/97890884-a647-48e7-8e81-be963e2b290b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.124507 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"300c099d-eca4-4f0c-a79f-dde4dddd8a98","Type":"ContainerStarted","Data":"e7ae00adc6008de72d279c24b5e188e9839442bb1950329bd992e04f65d16807"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.126142 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-784fdb6789-wc4j6" event={"ID":"e2289a84-2fbc-42b5-884a-22f9134e8e15","Type":"ContainerStarted","Data":"099dc291413330a09b1a30f43e1c80451f937b4117b405ca45d1b037b652df60"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.129232 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w9hqc" event={"ID":"2f038500-ef77-401b-9e63-755ea9a56695","Type":"ContainerStarted","Data":"797a3a7a27493b94efa9329cd8ee28369ac5bab26df3bbe2a4741b605c44d619"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.130629 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-484wv" event={"ID":"49fb758b-0291-4449-aa49-7e191cc1b2dc","Type":"ContainerStarted","Data":"5db80e94ef9ef05328eeffeb6e5af6fb9728f58c3517938302089031313172a9"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.136556 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t64xt" event={"ID":"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f","Type":"ContainerStarted","Data":"4ae96bfa993eb5985f9b00374da0d3249ad1c982eea46fc5e9e966f77ec5a5c3"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.160279 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-t64xt" podStartSLOduration=3.160260571 podStartE2EDuration="3.160260571s" podCreationTimestamp="2025-10-13 18:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:05.152882363 +0000 UTC m=+1000.057248453" watchObservedRunningTime="2025-10-13 18:31:05.160260571 +0000 UTC m=+1000.064626651" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.194060 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"13cd3ab5-2164-479f-bff9-8cac3b968f77","Type":"ContainerStarted","Data":"ddd48e1ded223fc45c7bc8682f76dcb969ebdb86c22ddba39bf43613f89dbd1c"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.194105 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"13cd3ab5-2164-479f-bff9-8cac3b968f77","Type":"ContainerStarted","Data":"dea6ea33a67638d3e93418686ba9e8edef84edb3bc693dac1abadf139ebb8578"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.195155 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.218900 4974 generic.go:334] "Generic (PLEG): container finished" podID="7ec9f25a-64d8-4904-b1df-03678b32a639" containerID="937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821" exitCode=0 Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.219003 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" event={"ID":"7ec9f25a-64d8-4904-b1df-03678b32a639","Type":"ContainerDied","Data":"937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.219068 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" event={"ID":"7ec9f25a-64d8-4904-b1df-03678b32a639","Type":"ContainerStarted","Data":"26ad524b7b383bd640a52ef490e9fb8f0d3073fa9a1bd1bdee19624daf5fd88a"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.226962 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-755dd945cf-28mpc" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.226974 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-755dd945cf-28mpc" event={"ID":"97890884-a647-48e7-8e81-be963e2b290b","Type":"ContainerDied","Data":"902fd3704ce67965ebc5718eab5a09fd6b2326b9cd28ca9b50aff6f8e18dcdb8"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.227029 4974 scope.go:117] "RemoveContainer" containerID="e2d45ec80db4d6f8cc278511141e2275a5c60059ee6484cca187886ab0ebd0a1" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.228878 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.228858185 podStartE2EDuration="3.228858185s" podCreationTimestamp="2025-10-13 18:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:05.21305141 +0000 UTC m=+1000.117417480" watchObservedRunningTime="2025-10-13 18:31:05.228858185 +0000 UTC m=+1000.133224265" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.262357 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcccd5645-2z74m" event={"ID":"1a480348-b0db-489e-be33-a93c1c6d311f","Type":"ContainerStarted","Data":"59130d1b7875579b8030802af7d30ef342a2d573bfd67c045e3031753c94b255"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.289420 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l2v67" event={"ID":"f29d4def-3da3-43a3-8331-f1ee4644dad2","Type":"ContainerStarted","Data":"a8ddb95503c6e81d03b2773af57e9348661c1e726c20422690017805867446fe"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.294829 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e","Type":"ContainerStarted","Data":"cd59784eb3d7b995313f040942c41bf27d4ae800a62031cb0c30345bda5f3a51"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.296470 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-njgsh" event={"ID":"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3","Type":"ContainerStarted","Data":"870a44faa56d8f5c1e61769f7df75b65be2735c3166c19127fada16ceebc1c05"} Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.298380 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-755dd945cf-28mpc"] Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.304253 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-755dd945cf-28mpc"] Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.879463 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97890884-a647-48e7-8e81-be963e2b290b" path="/var/lib/kubelet/pods/97890884-a647-48e7-8e81-be963e2b290b/volumes" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.881213 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.895984 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fcccd5645-2z74m"] Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.907020 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.929469 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-865cd5d4d7-6kmth"] Oct 13 18:31:05 crc kubenswrapper[4974]: E1013 18:31:05.930058 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97890884-a647-48e7-8e81-be963e2b290b" containerName="init" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.930133 4974 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="97890884-a647-48e7-8e81-be963e2b290b" containerName="init" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.930853 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="97890884-a647-48e7-8e81-be963e2b290b" containerName="init" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.932187 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:05 crc kubenswrapper[4974]: I1013 18:31:05.967512 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-865cd5d4d7-6kmth"] Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.026967 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhg7p\" (UniqueName: \"kubernetes.io/projected/87f47666-886f-422d-99b5-607d95d84774-kube-api-access-zhg7p\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.027095 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/87f47666-886f-422d-99b5-607d95d84774-horizon-secret-key\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.027162 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-scripts\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.027268 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-config-data\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.027309 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f47666-886f-422d-99b5-607d95d84774-logs\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.132016 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f47666-886f-422d-99b5-607d95d84774-logs\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.129359 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f47666-886f-422d-99b5-607d95d84774-logs\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.132499 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhg7p\" (UniqueName: \"kubernetes.io/projected/87f47666-886f-422d-99b5-607d95d84774-kube-api-access-zhg7p\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.132579 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/87f47666-886f-422d-99b5-607d95d84774-horizon-secret-key\") pod \"horizon-865cd5d4d7-6kmth\" (UID: 
\"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.132641 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-scripts\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.132781 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-config-data\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.137793 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-config-data\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.138222 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-scripts\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.170178 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/87f47666-886f-422d-99b5-607d95d84774-horizon-secret-key\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.170238 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhg7p\" (UniqueName: \"kubernetes.io/projected/87f47666-886f-422d-99b5-607d95d84774-kube-api-access-zhg7p\") pod \"horizon-865cd5d4d7-6kmth\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:06 crc kubenswrapper[4974]: I1013 18:31:06.256317 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:07 crc kubenswrapper[4974]: I1013 18:31:07.324320 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:31:07 crc kubenswrapper[4974]: I1013 18:31:07.324826 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api-log" containerID="cri-o://dea6ea33a67638d3e93418686ba9e8edef84edb3bc693dac1abadf139ebb8578" gracePeriod=30 Oct 13 18:31:07 crc kubenswrapper[4974]: I1013 18:31:07.324926 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api" containerID="cri-o://ddd48e1ded223fc45c7bc8682f76dcb969ebdb86c22ddba39bf43613f89dbd1c" gracePeriod=30 Oct 13 18:31:07 crc kubenswrapper[4974]: I1013 18:31:07.332393 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": EOF" Oct 13 18:31:07 crc kubenswrapper[4974]: I1013 18:31:07.742692 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:31:07 
crc kubenswrapper[4974]: I1013 18:31:07.742979 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:31:07 crc kubenswrapper[4974]: I1013 18:31:07.969295 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 13 18:31:08 crc kubenswrapper[4974]: I1013 18:31:08.340973 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w9hqc" event={"ID":"2f038500-ef77-401b-9e63-755ea9a56695","Type":"ContainerStarted","Data":"852316323f978c82b4f75ca76cf975a0b30f46ec83deea56da2ba716de24ccd2"} Oct 13 18:31:08 crc kubenswrapper[4974]: I1013 18:31:08.343721 4974 generic.go:334] "Generic (PLEG): container finished" podID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerID="dea6ea33a67638d3e93418686ba9e8edef84edb3bc693dac1abadf139ebb8578" exitCode=143 Oct 13 18:31:08 crc kubenswrapper[4974]: I1013 18:31:08.343765 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"13cd3ab5-2164-479f-bff9-8cac3b968f77","Type":"ContainerDied","Data":"dea6ea33a67638d3e93418686ba9e8edef84edb3bc693dac1abadf139ebb8578"} Oct 13 18:31:08 crc kubenswrapper[4974]: I1013 18:31:08.359324 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-w9hqc" podStartSLOduration=5.359303277 podStartE2EDuration="5.359303277s" podCreationTimestamp="2025-10-13 18:31:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:08.352950878 +0000 UTC m=+1003.257316958" watchObservedRunningTime="2025-10-13 18:31:08.359303277 +0000 UTC m=+1003.263669367" Oct 13 18:31:09 crc 
kubenswrapper[4974]: I1013 18:31:09.085226 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": read tcp 10.217.0.2:34896->10.217.0.155:9322: read: connection reset by peer" Oct 13 18:31:09 crc kubenswrapper[4974]: I1013 18:31:09.086046 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": dial tcp 10.217.0.155:9322: connect: connection refused" Oct 13 18:31:09 crc kubenswrapper[4974]: I1013 18:31:09.357762 4974 generic.go:334] "Generic (PLEG): container finished" podID="ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" containerID="4ae96bfa993eb5985f9b00374da0d3249ad1c982eea46fc5e9e966f77ec5a5c3" exitCode=0 Oct 13 18:31:09 crc kubenswrapper[4974]: I1013 18:31:09.357849 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t64xt" event={"ID":"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f","Type":"ContainerDied","Data":"4ae96bfa993eb5985f9b00374da0d3249ad1c982eea46fc5e9e966f77ec5a5c3"} Oct 13 18:31:09 crc kubenswrapper[4974]: I1013 18:31:09.367818 4974 generic.go:334] "Generic (PLEG): container finished" podID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerID="ddd48e1ded223fc45c7bc8682f76dcb969ebdb86c22ddba39bf43613f89dbd1c" exitCode=0 Oct 13 18:31:09 crc kubenswrapper[4974]: I1013 18:31:09.368487 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"13cd3ab5-2164-479f-bff9-8cac3b968f77","Type":"ContainerDied","Data":"ddd48e1ded223fc45c7bc8682f76dcb969ebdb86c22ddba39bf43613f89dbd1c"} Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.299222 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-784fdb6789-wc4j6"] Oct 13 18:31:11 crc 
kubenswrapper[4974]: I1013 18:31:11.337952 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5796767b68-9dktc"] Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.339540 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.342030 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.347384 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5796767b68-9dktc"] Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.411118 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-865cd5d4d7-6kmth"] Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.448773 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-tls-certs\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.448824 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p7vvd\" (UniqueName: \"kubernetes.io/projected/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-kube-api-access-p7vvd\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.448870 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-config-data\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " 
pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.448893 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-secret-key\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.448911 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-scripts\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.448977 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-logs\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.449008 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-combined-ca-bundle\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.455711 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6dcbf4cfcd-l89jc"] Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.457329 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.491718 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dcbf4cfcd-l89jc"] Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550355 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/003d2222-76eb-4a8c-b7c2-f201e88c542d-horizon-secret-key\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550429 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/003d2222-76eb-4a8c-b7c2-f201e88c542d-scripts\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550511 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/003d2222-76eb-4a8c-b7c2-f201e88c542d-horizon-tls-certs\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550584 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-logs\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550621 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/003d2222-76eb-4a8c-b7c2-f201e88c542d-config-data\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550670 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003d2222-76eb-4a8c-b7c2-f201e88c542d-logs\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550703 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-combined-ca-bundle\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550738 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-tls-certs\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550797 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p7vvd\" (UniqueName: \"kubernetes.io/projected/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-kube-api-access-p7vvd\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550829 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v455\" (UniqueName: 
\"kubernetes.io/projected/003d2222-76eb-4a8c-b7c2-f201e88c542d-kube-api-access-8v455\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550873 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003d2222-76eb-4a8c-b7c2-f201e88c542d-combined-ca-bundle\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550903 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-config-data\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550939 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-secret-key\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.550968 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-scripts\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.551933 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-scripts\") pod 
\"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.552271 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-logs\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.556063 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-config-data\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.560017 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-combined-ca-bundle\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.581183 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-secret-key\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.587265 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p7vvd\" (UniqueName: \"kubernetes.io/projected/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-kube-api-access-p7vvd\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 
crc kubenswrapper[4974]: I1013 18:31:11.587731 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-tls-certs\") pod \"horizon-5796767b68-9dktc\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.652275 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/003d2222-76eb-4a8c-b7c2-f201e88c542d-config-data\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.652320 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003d2222-76eb-4a8c-b7c2-f201e88c542d-logs\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.652379 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8v455\" (UniqueName: \"kubernetes.io/projected/003d2222-76eb-4a8c-b7c2-f201e88c542d-kube-api-access-8v455\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.652411 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003d2222-76eb-4a8c-b7c2-f201e88c542d-combined-ca-bundle\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.652442 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/003d2222-76eb-4a8c-b7c2-f201e88c542d-horizon-secret-key\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.652465 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/003d2222-76eb-4a8c-b7c2-f201e88c542d-scripts\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.652502 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/003d2222-76eb-4a8c-b7c2-f201e88c542d-horizon-tls-certs\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.652816 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/003d2222-76eb-4a8c-b7c2-f201e88c542d-logs\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.653202 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/003d2222-76eb-4a8c-b7c2-f201e88c542d-scripts\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.659348 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/003d2222-76eb-4a8c-b7c2-f201e88c542d-config-data\") 
pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.666322 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/003d2222-76eb-4a8c-b7c2-f201e88c542d-horizon-tls-certs\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.666877 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/003d2222-76eb-4a8c-b7c2-f201e88c542d-combined-ca-bundle\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.667163 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/003d2222-76eb-4a8c-b7c2-f201e88c542d-horizon-secret-key\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.673398 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.693351 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v455\" (UniqueName: \"kubernetes.io/projected/003d2222-76eb-4a8c-b7c2-f201e88c542d-kube-api-access-8v455\") pod \"horizon-6dcbf4cfcd-l89jc\" (UID: \"003d2222-76eb-4a8c-b7c2-f201e88c542d\") " pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:11 crc kubenswrapper[4974]: I1013 18:31:11.781731 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:17 crc kubenswrapper[4974]: I1013 18:31:17.969580 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.020360 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t64xt" Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.196065 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4cvn\" (UniqueName: \"kubernetes.io/projected/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-kube-api-access-t4cvn\") pod \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.196172 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-fernet-keys\") pod \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.196316 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-combined-ca-bundle\") pod \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.196360 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-config-data\") pod \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\" (UID: 
\"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.196399 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-scripts\") pod \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.196458 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-credential-keys\") pod \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\" (UID: \"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f\") " Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.203112 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" (UID: "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.203764 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" (UID: "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.203796 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-scripts" (OuterVolumeSpecName: "scripts") pod "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" (UID: "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.218213 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-kube-api-access-t4cvn" (OuterVolumeSpecName: "kube-api-access-t4cvn") pod "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" (UID: "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f"). InnerVolumeSpecName "kube-api-access-t4cvn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.225136 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" (UID: "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.249234 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-config-data" (OuterVolumeSpecName: "config-data") pod "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" (UID: "ffb22f77-2d27-4de4-bdc6-0d17ba93e46f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.301333 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4cvn\" (UniqueName: \"kubernetes.io/projected/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-kube-api-access-t4cvn\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.301387 4974 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-fernet-keys\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.301421 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.301439 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-config-data\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.301456 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-scripts\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.301472 4974 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f-credential-keys\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.463163 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-t64xt" event={"ID":"ffb22f77-2d27-4de4-bdc6-0d17ba93e46f","Type":"ContainerDied","Data":"347de3a34256615fbdc9e2dba681fab193b19bb27ea0279a4a7553cd66e73e83"}
Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.463199 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="347de3a34256615fbdc9e2dba681fab193b19bb27ea0279a4a7553cd66e73e83"
Oct 13 18:31:19 crc kubenswrapper[4974]: I1013 18:31:19.463261 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-t64xt"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.155263 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-t64xt"]
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.167911 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-t64xt"]
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.237319 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-2q69p"]
Oct 13 18:31:20 crc kubenswrapper[4974]: E1013 18:31:20.237867 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" containerName="keystone-bootstrap"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.237891 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" containerName="keystone-bootstrap"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.238111 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" containerName="keystone-bootstrap"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.238952 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.241889 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.242023 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.243168 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-788q9"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.248260 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.252242 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2q69p"]
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.422147 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-scripts\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.423363 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-combined-ca-bundle\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.423920 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-credential-keys\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.424234 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k25p\" (UniqueName: \"kubernetes.io/projected/af98b1ee-3954-4c18-9656-f61280f56b95-kube-api-access-2k25p\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.424525 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-fernet-keys\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.424823 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-config-data\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.526168 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k25p\" (UniqueName: \"kubernetes.io/projected/af98b1ee-3954-4c18-9656-f61280f56b95-kube-api-access-2k25p\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.526251 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-fernet-keys\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.526290 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-config-data\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.526438 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-scripts\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.526580 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-combined-ca-bundle\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.526686 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-credential-keys\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.534936 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-credential-keys\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.537606 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-fernet-keys\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.538228 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-scripts\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.538315 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-config-data\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.538898 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-combined-ca-bundle\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.566035 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k25p\" (UniqueName: \"kubernetes.io/projected/af98b1ee-3954-4c18-9656-f61280f56b95-kube-api-access-2k25p\") pod \"keystone-bootstrap-2q69p\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:20 crc kubenswrapper[4974]: I1013 18:31:20.860095 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2q69p"
Oct 13 18:31:21 crc kubenswrapper[4974]: E1013 18:31:21.809079 4974 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest"
Oct 13 18:31:21 crc kubenswrapper[4974]: E1013 18:31:21.809612 4974 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest"
Oct 13 18:31:21 crc kubenswrapper[4974]: E1013 18:31:21.810815 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.119:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h587hcfh6chd8h5fbh597h8dhd7h5b9h64ch59fh598h686h5dfh7fh659h694h67ch685h5bchb6h64fh56ch54dhffh66fh67h5d9h595hb5h64cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxrvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(300c099d-eca4-4f0c-a79f-dde4dddd8a98): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 13 18:31:21 crc kubenswrapper[4974]: I1013 18:31:21.842228 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffb22f77-2d27-4de4-bdc6-0d17ba93e46f" path="/var/lib/kubelet/pods/ffb22f77-2d27-4de4-bdc6-0d17ba93e46f/volumes"
Oct 13 18:31:21 crc kubenswrapper[4974]: I1013 18:31:21.974516 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.158786 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13cd3ab5-2164-479f-bff9-8cac3b968f77-logs\") pod \"13cd3ab5-2164-479f-bff9-8cac3b968f77\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") "
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.159170 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-combined-ca-bundle\") pod \"13cd3ab5-2164-479f-bff9-8cac3b968f77\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") "
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.159303 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13cd3ab5-2164-479f-bff9-8cac3b968f77-logs" (OuterVolumeSpecName: "logs") pod "13cd3ab5-2164-479f-bff9-8cac3b968f77" (UID: "13cd3ab5-2164-479f-bff9-8cac3b968f77"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.159310 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-custom-prometheus-ca\") pod \"13cd3ab5-2164-479f-bff9-8cac3b968f77\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") "
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.159378 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8snqs\" (UniqueName: \"kubernetes.io/projected/13cd3ab5-2164-479f-bff9-8cac3b968f77-kube-api-access-8snqs\") pod \"13cd3ab5-2164-479f-bff9-8cac3b968f77\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") "
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.159477 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-config-data\") pod \"13cd3ab5-2164-479f-bff9-8cac3b968f77\" (UID: \"13cd3ab5-2164-479f-bff9-8cac3b968f77\") "
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.160015 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13cd3ab5-2164-479f-bff9-8cac3b968f77-logs\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.163333 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13cd3ab5-2164-479f-bff9-8cac3b968f77-kube-api-access-8snqs" (OuterVolumeSpecName: "kube-api-access-8snqs") pod "13cd3ab5-2164-479f-bff9-8cac3b968f77" (UID: "13cd3ab5-2164-479f-bff9-8cac3b968f77"). InnerVolumeSpecName "kube-api-access-8snqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.190748 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "13cd3ab5-2164-479f-bff9-8cac3b968f77" (UID: "13cd3ab5-2164-479f-bff9-8cac3b968f77"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.194785 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13cd3ab5-2164-479f-bff9-8cac3b968f77" (UID: "13cd3ab5-2164-479f-bff9-8cac3b968f77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.246307 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-config-data" (OuterVolumeSpecName: "config-data") pod "13cd3ab5-2164-479f-bff9-8cac3b968f77" (UID: "13cd3ab5-2164-479f-bff9-8cac3b968f77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.261614 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.261764 4974 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.261786 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8snqs\" (UniqueName: \"kubernetes.io/projected/13cd3ab5-2164-479f-bff9-8cac3b968f77-kube-api-access-8snqs\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.261801 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13cd3ab5-2164-479f-bff9-8cac3b968f77-config-data\") on node \"crc\" DevicePath \"\""
Oct 13 18:31:22 crc kubenswrapper[4974]: E1013 18:31:22.382559 4974 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-barbican-api:watcher_latest"
Oct 13 18:31:22 crc kubenswrapper[4974]: E1013 18:31:22.382612 4974 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-barbican-api:watcher_latest"
Oct 13 18:31:22 crc kubenswrapper[4974]: E1013 18:31:22.382745 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:38.102.83.119:5001/podified-master-centos10/openstack-barbican-api:watcher_latest,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bht4g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-l2v67_openstack(f29d4def-3da3-43a3-8331-f1ee4644dad2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Oct 13 18:31:22 crc kubenswrapper[4974]: E1013 18:31:22.383949 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-l2v67" podUID="f29d4def-3da3-43a3-8331-f1ee4644dad2"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.506820 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"13cd3ab5-2164-479f-bff9-8cac3b968f77","Type":"ContainerDied","Data":"e8afe6effa0077347206cd48588be68b38feb70f365ae154c47d5ad1cd83f1c3"}
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.506868 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.506901 4974 scope.go:117] "RemoveContainer" containerID="ddd48e1ded223fc45c7bc8682f76dcb969ebdb86c22ddba39bf43613f89dbd1c"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.511865 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" event={"ID":"7ec9f25a-64d8-4904-b1df-03678b32a639","Type":"ContainerStarted","Data":"0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8"}
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.512296 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb"
Oct 13 18:31:22 crc kubenswrapper[4974]: E1013 18:31:22.513416 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.119:5001/podified-master-centos10/openstack-barbican-api:watcher_latest\\\"\"" pod="openstack/barbican-db-sync-l2v67" podUID="f29d4def-3da3-43a3-8331-f1ee4644dad2"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.552027 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" podStartSLOduration=20.552008152 podStartE2EDuration="20.552008152s" podCreationTimestamp="2025-10-13 18:31:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:22.548268976 +0000 UTC m=+1017.452635076" watchObservedRunningTime="2025-10-13 18:31:22.552008152 +0000 UTC m=+1017.456374242"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.573566 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.589051 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"]
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.597993 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Oct 13 18:31:22 crc kubenswrapper[4974]: E1013 18:31:22.598397 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api-log"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.598414 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api-log"
Oct 13 18:31:22 crc kubenswrapper[4974]: E1013 18:31:22.598436 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.598444 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.598629 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.598711 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api-log"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.599732 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.602937 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.622107 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.671136 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6bd0d0c-5130-4398-91e9-e96652bcae59-logs\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.671181 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.671224 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8g6t\" (UniqueName: \"kubernetes.io/projected/c6bd0d0c-5130-4398-91e9-e96652bcae59-kube-api-access-n8g6t\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.671379 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.671519 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-config-data\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.773071 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6bd0d0c-5130-4398-91e9-e96652bcae59-logs\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.773154 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.773235 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8g6t\" (UniqueName: \"kubernetes.io/projected/c6bd0d0c-5130-4398-91e9-e96652bcae59-kube-api-access-n8g6t\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.773338 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.773406 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-config-data\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.775266 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6bd0d0c-5130-4398-91e9-e96652bcae59-logs\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.779748 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.781439 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-config-data\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.782317 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.801109 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8g6t\" (UniqueName: \"kubernetes.io/projected/c6bd0d0c-5130-4398-91e9-e96652bcae59-kube-api-access-n8g6t\") pod \"watcher-api-0\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.920034 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Oct 13 18:31:22 crc kubenswrapper[4974]: I1013 18:31:22.970453 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.155:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Oct 13 18:31:23 crc kubenswrapper[4974]: I1013 18:31:23.821767 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13cd3ab5-2164-479f-bff9-8cac3b968f77" path="/var/lib/kubelet/pods/13cd3ab5-2164-479f-bff9-8cac3b968f77/volumes"
Oct 13 18:31:27 crc kubenswrapper[4974]: I1013 18:31:27.556359 4974 generic.go:334] "Generic (PLEG): container finished" podID="2f038500-ef77-401b-9e63-755ea9a56695" containerID="852316323f978c82b4f75ca76cf975a0b30f46ec83deea56da2ba716de24ccd2" exitCode=0
Oct 13 18:31:27 crc kubenswrapper[4974]: I1013 18:31:27.556898 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w9hqc" event={"ID":"2f038500-ef77-401b-9e63-755ea9a56695","Type":"ContainerDied","Data":"852316323f978c82b4f75ca76cf975a0b30f46ec83deea56da2ba716de24ccd2"}
Oct 13 18:31:28 crc kubenswrapper[4974]: I1013 18:31:28.379785 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb"
Oct 13 18:31:28 crc kubenswrapper[4974]: I1013 18:31:28.456304 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f67486d89-6nn4q"]
Oct 13 18:31:28 crc kubenswrapper[4974]: I1013 18:31:28.456699 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" podUID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerName="dnsmasq-dns" containerID="cri-o://c07a1d964e61f6c66cf31897362b4e3ede991f8a0095ba18954f3162cd3a5e9b" gracePeriod=10
Oct 13 18:31:29 crc kubenswrapper[4974]: I1013 18:31:29.587910 4974 generic.go:334] "Generic (PLEG): container finished" podID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerID="c07a1d964e61f6c66cf31897362b4e3ede991f8a0095ba18954f3162cd3a5e9b" exitCode=0
Oct 13 18:31:29 crc kubenswrapper[4974]: I1013 18:31:29.588198 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" event={"ID":"7a76ef05-7c82-4ef0-81c8-83fbef1d3496","Type":"ContainerDied","Data":"c07a1d964e61f6c66cf31897362b4e3ede991f8a0095ba18954f3162cd3a5e9b"}
Oct 13 18:31:30 crc kubenswrapper[4974]: I1013 18:31:30.070587 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" podUID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused"
Oct 13 18:31:35 crc kubenswrapper[4974]: I1013 18:31:35.070529 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" podUID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused"
Oct 13 18:31:37 crc kubenswrapper[4974]: I1013 18:31:37.743169 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 18:31:37 crc kubenswrapper[4974]: I1013 18:31:37.743533 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.071466 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" podUID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: connect: connection refused"
Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.072079 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q"
Oct 13 18:31:40 crc kubenswrapper[4974]: E1013 18:31:40.411935 4974 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-glance-api:watcher_latest"
Oct 13 18:31:40 crc kubenswrapper[4974]: E1013 18:31:40.412028 4974 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-glance-api:watcher_latest"
Oct 13 18:31:40 crc kubenswrapper[4974]: E1013 18:31:40.412267 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:38.102.83.119:5001/podified-master-centos10/openstack-glance-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gtwx7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-jgthj_openstack(ab455ee5-f8bd-4d4e-b179-524fa3edcc52): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Oct 13 18:31:40 crc kubenswrapper[4974]: E1013 18:31:40.413715 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-jgthj" podUID="ab455ee5-f8bd-4d4e-b179-524fa3edcc52" Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.485121 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.560799 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-config\") pod \"2f038500-ef77-401b-9e63-755ea9a56695\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.560870 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-combined-ca-bundle\") pod \"2f038500-ef77-401b-9e63-755ea9a56695\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.560954 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmmvx\" (UniqueName: \"kubernetes.io/projected/2f038500-ef77-401b-9e63-755ea9a56695-kube-api-access-fmmvx\") pod \"2f038500-ef77-401b-9e63-755ea9a56695\" (UID: \"2f038500-ef77-401b-9e63-755ea9a56695\") " Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.575175 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f038500-ef77-401b-9e63-755ea9a56695-kube-api-access-fmmvx" (OuterVolumeSpecName: "kube-api-access-fmmvx") pod "2f038500-ef77-401b-9e63-755ea9a56695" (UID: 
"2f038500-ef77-401b-9e63-755ea9a56695"). InnerVolumeSpecName "kube-api-access-fmmvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.591692 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-config" (OuterVolumeSpecName: "config") pod "2f038500-ef77-401b-9e63-755ea9a56695" (UID: "2f038500-ef77-401b-9e63-755ea9a56695"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.608475 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f038500-ef77-401b-9e63-755ea9a56695" (UID: "2f038500-ef77-401b-9e63-755ea9a56695"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.665597 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.666341 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f038500-ef77-401b-9e63-755ea9a56695-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.666366 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmmvx\" (UniqueName: \"kubernetes.io/projected/2f038500-ef77-401b-9e63-755ea9a56695-kube-api-access-fmmvx\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.695842 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-w9hqc" Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.696164 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-w9hqc" event={"ID":"2f038500-ef77-401b-9e63-755ea9a56695","Type":"ContainerDied","Data":"797a3a7a27493b94efa9329cd8ee28369ac5bab26df3bbe2a4741b605c44d619"} Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.696193 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="797a3a7a27493b94efa9329cd8ee28369ac5bab26df3bbe2a4741b605c44d619" Oct 13 18:31:40 crc kubenswrapper[4974]: E1013 18:31:40.698585 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.119:5001/podified-master-centos10/openstack-glance-api:watcher_latest\\\"\"" pod="openstack/glance-db-sync-jgthj" podUID="ab455ee5-f8bd-4d4e-b179-524fa3edcc52" Oct 13 18:31:40 crc kubenswrapper[4974]: I1013 18:31:40.810548 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-865cd5d4d7-6kmth"] Oct 13 18:31:41 crc kubenswrapper[4974]: E1013 18:31:41.454329 4974 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 13 18:31:41 crc kubenswrapper[4974]: E1013 18:31:41.454390 4974 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Oct 13 18:31:41 crc kubenswrapper[4974]: E1013 18:31:41.454542 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:38.102.83.119:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zbb85,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompPro
file:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-njgsh_openstack(bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 18:31:41 crc kubenswrapper[4974]: E1013 18:31:41.455737 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-njgsh" podUID="bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.463969 4974 scope.go:117] "RemoveContainer" containerID="dea6ea33a67638d3e93418686ba9e8edef84edb3bc693dac1abadf139ebb8578" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.557561 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.590901 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhts5\" (UniqueName: \"kubernetes.io/projected/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-kube-api-access-dhts5\") pod \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.591015 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-swift-storage-0\") pod \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.591043 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-sb\") pod \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.591080 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-svc\") pod \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.591141 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-config\") pod \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.591174 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-nb\") pod \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\" (UID: \"7a76ef05-7c82-4ef0-81c8-83fbef1d3496\") " Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.596839 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-kube-api-access-dhts5" (OuterVolumeSpecName: "kube-api-access-dhts5") pod "7a76ef05-7c82-4ef0-81c8-83fbef1d3496" (UID: "7a76ef05-7c82-4ef0-81c8-83fbef1d3496"). InnerVolumeSpecName "kube-api-access-dhts5". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.665827 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7a76ef05-7c82-4ef0-81c8-83fbef1d3496" (UID: "7a76ef05-7c82-4ef0-81c8-83fbef1d3496"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.676176 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7a76ef05-7c82-4ef0-81c8-83fbef1d3496" (UID: "7a76ef05-7c82-4ef0-81c8-83fbef1d3496"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.689711 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7a76ef05-7c82-4ef0-81c8-83fbef1d3496" (UID: "7a76ef05-7c82-4ef0-81c8-83fbef1d3496"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.692508 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7a76ef05-7c82-4ef0-81c8-83fbef1d3496" (UID: "7a76ef05-7c82-4ef0-81c8-83fbef1d3496"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.694901 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhts5\" (UniqueName: \"kubernetes.io/projected/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-kube-api-access-dhts5\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.694925 4974 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.694935 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.694944 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.694953 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.702100 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-config" (OuterVolumeSpecName: "config") pod "7a76ef05-7c82-4ef0-81c8-83fbef1d3496" (UID: "7a76ef05-7c82-4ef0-81c8-83fbef1d3496"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.721505 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" event={"ID":"7a76ef05-7c82-4ef0-81c8-83fbef1d3496","Type":"ContainerDied","Data":"58b0ee964d0f1531c693f956febc25435a97ed38703ed4c73938b00805af201c"} Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.721560 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f67486d89-6nn4q" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.728638 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-865cd5d4d7-6kmth" event={"ID":"87f47666-886f-422d-99b5-607d95d84774","Type":"ContainerStarted","Data":"066b37dc8014b7712dcf41b0c7df96eb4ec50feb8f127fce751cc20aad42b531"} Oct 13 18:31:41 crc kubenswrapper[4974]: E1013 18:31:41.732784 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.119:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-njgsh" podUID="bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.734000 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6dcbf4cfcd-l89jc"] Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.809886 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-549f94c95-vzgxw"] Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.813136 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7a76ef05-7c82-4ef0-81c8-83fbef1d3496-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:41 crc kubenswrapper[4974]: E1013 18:31:41.818311 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerName="init" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.818345 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerName="init" Oct 13 18:31:41 crc kubenswrapper[4974]: E1013 18:31:41.818369 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerName="dnsmasq-dns" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.818376 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerName="dnsmasq-dns" Oct 13 18:31:41 crc kubenswrapper[4974]: E1013 18:31:41.818404 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f038500-ef77-401b-9e63-755ea9a56695" containerName="neutron-db-sync" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.818466 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f038500-ef77-401b-9e63-755ea9a56695" containerName="neutron-db-sync" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.829277 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" containerName="dnsmasq-dns" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.831133 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f038500-ef77-401b-9e63-755ea9a56695" containerName="neutron-db-sync" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.856010 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.883379 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549f94c95-vzgxw"] Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.892284 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f67486d89-6nn4q"] Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.908580 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f67486d89-6nn4q"] Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.915951 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-579fddb58d-n5xbk"] Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.918268 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.920149 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hsz8q" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.920211 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.920471 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.922838 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Oct 13 18:31:41 crc kubenswrapper[4974]: I1013 18:31:41.925265 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-579fddb58d-n5xbk"] Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.017138 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-config\") pod 
\"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.017333 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-swift-storage-0\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.017407 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd5fz\" (UniqueName: \"kubernetes.io/projected/dec721b6-7daa-4481-8e76-df9054f32f97-kube-api-access-hd5fz\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.017498 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-ovndb-tls-certs\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.017600 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-sb\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.017769 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-httpd-config\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.017857 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4cnf\" (UniqueName: \"kubernetes.io/projected/1fec344f-2e1a-4913-a239-c891d311e830-kube-api-access-n4cnf\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.017887 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-config\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.018020 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-combined-ca-bundle\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.022492 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-nb\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.023696 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-svc\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.045387 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5796767b68-9dktc"] Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126171 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-svc\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126233 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-config\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126285 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-swift-storage-0\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126309 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd5fz\" (UniqueName: \"kubernetes.io/projected/dec721b6-7daa-4481-8e76-df9054f32f97-kube-api-access-hd5fz\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126339 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-ovndb-tls-certs\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126373 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-sb\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126401 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-httpd-config\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126431 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4cnf\" (UniqueName: \"kubernetes.io/projected/1fec344f-2e1a-4913-a239-c891d311e830-kube-api-access-n4cnf\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126453 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-config\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126488 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-combined-ca-bundle\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.126534 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-nb\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.128541 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-nb\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.128805 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-config\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.128885 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-swift-storage-0\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.129483 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-sb\") pod 
\"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.132806 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-combined-ca-bundle\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.133880 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-svc\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.133980 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-httpd-config\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.135543 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-config\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.140226 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-ovndb-tls-certs\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc 
kubenswrapper[4974]: I1013 18:31:42.142781 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4cnf\" (UniqueName: \"kubernetes.io/projected/1fec344f-2e1a-4913-a239-c891d311e830-kube-api-access-n4cnf\") pod \"dnsmasq-dns-549f94c95-vzgxw\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.146775 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd5fz\" (UniqueName: \"kubernetes.io/projected/dec721b6-7daa-4481-8e76-df9054f32f97-kube-api-access-hd5fz\") pod \"neutron-579fddb58d-n5xbk\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: E1013 18:31:42.151897 4974 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-ceilometer-notification:watcher_latest" Oct 13 18:31:42 crc kubenswrapper[4974]: E1013 18:31:42.151958 4974 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.119:5001/podified-master-centos10/openstack-ceilometer-notification:watcher_latest" Oct 13 18:31:42 crc kubenswrapper[4974]: E1013 18:31:42.152075 4974 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-notification-agent,Image:38.102.83.119:5001/podified-master-centos10/openstack-ceilometer-notification:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n564h587hcfh6chd8h5fbh597h8dhd7h5b9h64ch59fh598h686h5dfh7fh659h694h67ch685h5bchb6h64fh56ch54dhffh66fh67h5d9h595hb5h64cq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-notification-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sxrvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/notificationhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(300c099d-eca4-4f0c-a79f-dde4dddd8a98): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Oct 13 18:31:42 crc kubenswrapper[4974]: W1013 18:31:42.163480 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9b88447_1cdb_4666_a4c2_31b7a0e7192f.slice/crio-3a4cbb1020ef81130d0f39ec5b7a88db5ee0782152c9a294fd20fd56beba8a46 WatchSource:0}: Error finding container 3a4cbb1020ef81130d0f39ec5b7a88db5ee0782152c9a294fd20fd56beba8a46: Status 404 returned error can't find the container with id 3a4cbb1020ef81130d0f39ec5b7a88db5ee0782152c9a294fd20fd56beba8a46 Oct 13 18:31:42 crc kubenswrapper[4974]: W1013 18:31:42.168857 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod003d2222_76eb_4a8c_b7c2_f201e88c542d.slice/crio-a83eb91d1d0b4364f7fbc757685944793263a8d87e7a61869baaf048556d018c WatchSource:0}: Error finding container a83eb91d1d0b4364f7fbc757685944793263a8d87e7a61869baaf048556d018c: Status 404 returned error can't find the 
container with id a83eb91d1d0b4364f7fbc757685944793263a8d87e7a61869baaf048556d018c Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.175422 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.219146 4974 scope.go:117] "RemoveContainer" containerID="c07a1d964e61f6c66cf31897362b4e3ede991f8a0095ba18954f3162cd3a5e9b" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.245662 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.313383 4974 scope.go:117] "RemoveContainer" containerID="94333711a6bedf7157f1d6d764a66a8412281b8ae658b77f6493ba3ff78f8636" Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.691448 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-2q69p"] Oct 13 18:31:42 crc kubenswrapper[4974]: W1013 18:31:42.706841 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf98b1ee_3954_4c18_9656_f61280f56b95.slice/crio-bcbf368e16ea60b4ea1417514070ea11e79e2c88aa2c7e65e4ab4bd1f554fb3b WatchSource:0}: Error finding container bcbf368e16ea60b4ea1417514070ea11e79e2c88aa2c7e65e4ab4bd1f554fb3b: Status 404 returned error can't find the container with id bcbf368e16ea60b4ea1417514070ea11e79e2c88aa2c7e65e4ab4bd1f554fb3b Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.709761 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.746472 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dcbf4cfcd-l89jc" event={"ID":"003d2222-76eb-4a8c-b7c2-f201e88c542d","Type":"ContainerStarted","Data":"a83eb91d1d0b4364f7fbc757685944793263a8d87e7a61869baaf048556d018c"} Oct 13 18:31:42 crc 
kubenswrapper[4974]: I1013 18:31:42.757404 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5796767b68-9dktc" event={"ID":"a9b88447-1cdb-4666-a4c2-31b7a0e7192f","Type":"ContainerStarted","Data":"3a4cbb1020ef81130d0f39ec5b7a88db5ee0782152c9a294fd20fd56beba8a46"} Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.759084 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2q69p" event={"ID":"af98b1ee-3954-4c18-9656-f61280f56b95","Type":"ContainerStarted","Data":"bcbf368e16ea60b4ea1417514070ea11e79e2c88aa2c7e65e4ab4bd1f554fb3b"} Oct 13 18:31:42 crc kubenswrapper[4974]: I1013 18:31:42.885693 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-549f94c95-vzgxw"] Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.322896 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-579fddb58d-n5xbk"] Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.772340 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-579fddb58d-n5xbk" event={"ID":"dec721b6-7daa-4481-8e76-df9054f32f97","Type":"ContainerStarted","Data":"ac14dfe3d673da1c43de609033de25c209c82cf1ade6ae877d97d64706cd34ab"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.774486 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-865cd5d4d7-6kmth" event={"ID":"87f47666-886f-422d-99b5-607d95d84774","Type":"ContainerStarted","Data":"adf7a89af60f2f7ef0da4065d108699bd3d59983e4dc22e64406a5aad69652dc"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.782434 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-784fdb6789-wc4j6" event={"ID":"e2289a84-2fbc-42b5-884a-22f9134e8e15","Type":"ContainerStarted","Data":"a55b39f55821b2a80c3742aa351267ee7f94e96e5043681512efae6f920f584a"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.802742 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-db-sync-l2v67" event={"ID":"f29d4def-3da3-43a3-8331-f1ee4644dad2","Type":"ContainerStarted","Data":"eef82df6ce6cd9ca241efc80766eea2428b0a3c27c2c94a2478846595e58ac9a"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.836183 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a76ef05-7c82-4ef0-81c8-83fbef1d3496" path="/var/lib/kubelet/pods/7a76ef05-7c82-4ef0-81c8-83fbef1d3496/volumes" Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.838938 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" event={"ID":"1fec344f-2e1a-4913-a239-c891d311e830","Type":"ContainerStarted","Data":"cee2e569c0f2e92c7f1667bb862547dd18338bd1680d93cfb5d276300207cf04"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.849970 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"751ed806-3991-4b72-91fb-ff5d56c66849","Type":"ContainerStarted","Data":"9f3538a3336ad09da6e2d412fce7099b1f11061277d16c5d59fe29b11072dbbc"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.855427 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-l2v67" podStartSLOduration=3.213103139 podStartE2EDuration="40.855407093s" podCreationTimestamp="2025-10-13 18:31:03 +0000 UTC" firstStartedPulling="2025-10-13 18:31:04.779403492 +0000 UTC m=+999.683769572" lastFinishedPulling="2025-10-13 18:31:42.421707446 +0000 UTC m=+1037.326073526" observedRunningTime="2025-10-13 18:31:43.833637509 +0000 UTC m=+1038.738003589" watchObservedRunningTime="2025-10-13 18:31:43.855407093 +0000 UTC m=+1038.759773173" Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.872050 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=23.863331384 podStartE2EDuration="41.872031472s" podCreationTimestamp="2025-10-13 18:31:02 +0000 UTC" 
firstStartedPulling="2025-10-13 18:31:03.832151781 +0000 UTC m=+998.736517861" lastFinishedPulling="2025-10-13 18:31:21.840851829 +0000 UTC m=+1016.745217949" observedRunningTime="2025-10-13 18:31:43.868003648 +0000 UTC m=+1038.772369738" watchObservedRunningTime="2025-10-13 18:31:43.872031472 +0000 UTC m=+1038.776397542" Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.889887 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dcbf4cfcd-l89jc" event={"ID":"003d2222-76eb-4a8c-b7c2-f201e88c542d","Type":"ContainerStarted","Data":"b26e2ebebb21c94bdcd5019cb45ce556c9e0df6c67cd90d87d7c0ef9fcef0790"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.918991 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5796767b68-9dktc" event={"ID":"a9b88447-1cdb-4666-a4c2-31b7a0e7192f","Type":"ContainerStarted","Data":"09cc0d62a371439bdad9579ae7fbafbb030801916febc2419d1d87c924749bdc"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.930878 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e","Type":"ContainerStarted","Data":"69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.949679 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-484wv" event={"ID":"49fb758b-0291-4449-aa49-7e191cc1b2dc","Type":"ContainerStarted","Data":"eb645210fcad7a793e788506a5f516ba05c4c3d70cb3589162f560823eeb8cfb"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.962054 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=23.689108772 podStartE2EDuration="41.96203841s" podCreationTimestamp="2025-10-13 18:31:02 +0000 UTC" firstStartedPulling="2025-10-13 18:31:04.097841923 +0000 UTC m=+999.002208043" lastFinishedPulling="2025-10-13 18:31:22.370771601 
+0000 UTC m=+1017.275137681" observedRunningTime="2025-10-13 18:31:43.950029381 +0000 UTC m=+1038.854395461" watchObservedRunningTime="2025-10-13 18:31:43.96203841 +0000 UTC m=+1038.866404490" Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.967875 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-484wv" podStartSLOduration=5.869242949 podStartE2EDuration="41.967861324s" podCreationTimestamp="2025-10-13 18:31:02 +0000 UTC" firstStartedPulling="2025-10-13 18:31:04.259336157 +0000 UTC m=+999.163702227" lastFinishedPulling="2025-10-13 18:31:40.357954492 +0000 UTC m=+1035.262320602" observedRunningTime="2025-10-13 18:31:43.96525974 +0000 UTC m=+1038.869625820" watchObservedRunningTime="2025-10-13 18:31:43.967861324 +0000 UTC m=+1038.872227404" Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.970770 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2q69p" event={"ID":"af98b1ee-3954-4c18-9656-f61280f56b95","Type":"ContainerStarted","Data":"dee013a4e07ee8ac5a0f2cd484903875bfd6fcb36664a869168c4b0a3a09dc9e"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.977314 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c6bd0d0c-5130-4398-91e9-e96652bcae59","Type":"ContainerStarted","Data":"3991c5ce67933228abbdade41680d3fa186d322e2a0ef27f9e529aa4b15eb37f"} Oct 13 18:31:43 crc kubenswrapper[4974]: I1013 18:31:43.977349 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c6bd0d0c-5130-4398-91e9-e96652bcae59","Type":"ContainerStarted","Data":"fb6f6877e16f249f11a5003065c97736a8a6a516dcd76fd0feaa71aa98b00f67"} Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.002858 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-2q69p" podStartSLOduration=24.00283802 podStartE2EDuration="24.00283802s" 
podCreationTimestamp="2025-10-13 18:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:43.997975403 +0000 UTC m=+1038.902341493" watchObservedRunningTime="2025-10-13 18:31:44.00283802 +0000 UTC m=+1038.907204100" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.085426 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8475fc656f-dnpll"] Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.087028 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.091092 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.091327 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.123575 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8475fc656f-dnpll"] Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.205938 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-combined-ca-bundle\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.205990 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-public-tls-certs\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc 
kubenswrapper[4974]: I1013 18:31:44.206080 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-ovndb-tls-certs\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.206103 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-internal-tls-certs\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.206124 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-config\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.206159 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-httpd-config\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.206207 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9pwj\" (UniqueName: \"kubernetes.io/projected/59a33676-139f-4010-ab9a-25832163ab83-kube-api-access-k9pwj\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc 
kubenswrapper[4974]: I1013 18:31:44.307681 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-ovndb-tls-certs\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.307744 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-internal-tls-certs\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.307787 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-config\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.307855 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-httpd-config\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.307937 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9pwj\" (UniqueName: \"kubernetes.io/projected/59a33676-139f-4010-ab9a-25832163ab83-kube-api-access-k9pwj\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.307979 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-combined-ca-bundle\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.308014 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-public-tls-certs\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.336409 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-httpd-config\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.336409 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9pwj\" (UniqueName: \"kubernetes.io/projected/59a33676-139f-4010-ab9a-25832163ab83-kube-api-access-k9pwj\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.343704 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-public-tls-certs\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.344434 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-internal-tls-certs\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.354289 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-ovndb-tls-certs\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.356100 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-config\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.365919 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59a33676-139f-4010-ab9a-25832163ab83-combined-ca-bundle\") pod \"neutron-8475fc656f-dnpll\" (UID: \"59a33676-139f-4010-ab9a-25832163ab83\") " pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:44 crc kubenswrapper[4974]: I1013 18:31:44.451813 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.003864 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5796767b68-9dktc" event={"ID":"a9b88447-1cdb-4666-a4c2-31b7a0e7192f","Type":"ContainerStarted","Data":"c71f94d566debc81c03ff7c3180f22e17d2ee5689e9730d35d2824e22263bd4e"} Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.013208 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-784fdb6789-wc4j6" event={"ID":"e2289a84-2fbc-42b5-884a-22f9134e8e15","Type":"ContainerStarted","Data":"6190c9d6b685824a6eb6200399691cc50f068b01e3d085e4e8cfcdf7ca07a66e"} Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.013365 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-784fdb6789-wc4j6" podUID="e2289a84-2fbc-42b5-884a-22f9134e8e15" containerName="horizon-log" containerID="cri-o://a55b39f55821b2a80c3742aa351267ee7f94e96e5043681512efae6f920f584a" gracePeriod=30 Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.013440 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-784fdb6789-wc4j6" podUID="e2289a84-2fbc-42b5-884a-22f9134e8e15" containerName="horizon" containerID="cri-o://6190c9d6b685824a6eb6200399691cc50f068b01e3d085e4e8cfcdf7ca07a66e" gracePeriod=30 Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.028842 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcccd5645-2z74m" event={"ID":"1a480348-b0db-489e-be33-a93c1c6d311f","Type":"ContainerStarted","Data":"a2b12522dd03f99a07d5f8437c0839e57c2faebbb59c8d1ab0c2f6e558795e04"} Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.028886 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcccd5645-2z74m" 
event={"ID":"1a480348-b0db-489e-be33-a93c1c6d311f","Type":"ContainerStarted","Data":"d744872441ac27f8ea33b79021be8dc48a9a772584c8985c6bc37c67006261c3"} Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.029023 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fcccd5645-2z74m" podUID="1a480348-b0db-489e-be33-a93c1c6d311f" containerName="horizon-log" containerID="cri-o://d744872441ac27f8ea33b79021be8dc48a9a772584c8985c6bc37c67006261c3" gracePeriod=30 Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.029063 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5fcccd5645-2z74m" podUID="1a480348-b0db-489e-be33-a93c1c6d311f" containerName="horizon" containerID="cri-o://a2b12522dd03f99a07d5f8437c0839e57c2faebbb59c8d1ab0c2f6e558795e04" gracePeriod=30 Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.035766 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c6bd0d0c-5130-4398-91e9-e96652bcae59","Type":"ContainerStarted","Data":"029f51b6c8020fdfde3aa8a4b4c24f4b5f134cb93da0a7b729b1880376f917b7"} Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.035964 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5796767b68-9dktc" podStartSLOduration=34.035942852 podStartE2EDuration="34.035942852s" podCreationTimestamp="2025-10-13 18:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:45.023804869 +0000 UTC m=+1039.928170969" watchObservedRunningTime="2025-10-13 18:31:45.035942852 +0000 UTC m=+1039.940308932" Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.036735 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.044166 4974 generic.go:334] "Generic (PLEG): container 
finished" podID="1fec344f-2e1a-4913-a239-c891d311e830" containerID="e3dab4edeab41c1ba53393c31f9813138fa97182384ef3583e979f98e808399d" exitCode=0 Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.045016 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" event={"ID":"1fec344f-2e1a-4913-a239-c891d311e830","Type":"ContainerDied","Data":"e3dab4edeab41c1ba53393c31f9813138fa97182384ef3583e979f98e808399d"} Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.053554 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-784fdb6789-wc4j6" podStartSLOduration=5.130483817 podStartE2EDuration="43.053537968s" podCreationTimestamp="2025-10-13 18:31:02 +0000 UTC" firstStartedPulling="2025-10-13 18:31:04.262740743 +0000 UTC m=+999.167106813" lastFinishedPulling="2025-10-13 18:31:42.185794884 +0000 UTC m=+1037.090160964" observedRunningTime="2025-10-13 18:31:45.042785415 +0000 UTC m=+1039.947151495" watchObservedRunningTime="2025-10-13 18:31:45.053537968 +0000 UTC m=+1039.957904048" Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.070424 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6dcbf4cfcd-l89jc" event={"ID":"003d2222-76eb-4a8c-b7c2-f201e88c542d","Type":"ContainerStarted","Data":"1f7a9eb09304345e3cae0030d0c75ce89ae06b43b947eee73c93ac39a8dd6a67"} Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.084895 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-579fddb58d-n5xbk" event={"ID":"dec721b6-7daa-4481-8e76-df9054f32f97","Type":"ContainerStarted","Data":"0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25"} Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.092163 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5fcccd5645-2z74m" podStartSLOduration=5.461024548 podStartE2EDuration="43.092144477s" podCreationTimestamp="2025-10-13 18:31:02 +0000 UTC" 
firstStartedPulling="2025-10-13 18:31:04.598639474 +0000 UTC m=+999.503005554" lastFinishedPulling="2025-10-13 18:31:42.229759403 +0000 UTC m=+1037.134125483" observedRunningTime="2025-10-13 18:31:45.070149206 +0000 UTC m=+1039.974515286" watchObservedRunningTime="2025-10-13 18:31:45.092144477 +0000 UTC m=+1039.996510557" Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.095475 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=23.09546482 podStartE2EDuration="23.09546482s" podCreationTimestamp="2025-10-13 18:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:45.091338544 +0000 UTC m=+1039.995704624" watchObservedRunningTime="2025-10-13 18:31:45.09546482 +0000 UTC m=+1039.999830900" Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.102635 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-865cd5d4d7-6kmth" podUID="87f47666-886f-422d-99b5-607d95d84774" containerName="horizon-log" containerID="cri-o://adf7a89af60f2f7ef0da4065d108699bd3d59983e4dc22e64406a5aad69652dc" gracePeriod=30 Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.102786 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-865cd5d4d7-6kmth" event={"ID":"87f47666-886f-422d-99b5-607d95d84774","Type":"ContainerStarted","Data":"86740bc2eb3d5cd5632e4fa5352a768203ef4909fd0a60731da3326e2c249a71"} Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.104366 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-865cd5d4d7-6kmth" podUID="87f47666-886f-422d-99b5-607d95d84774" containerName="horizon" containerID="cri-o://86740bc2eb3d5cd5632e4fa5352a768203ef4909fd0a60731da3326e2c249a71" gracePeriod=30 Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.144445 4974 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/horizon-6dcbf4cfcd-l89jc" podStartSLOduration=34.144425861 podStartE2EDuration="34.144425861s" podCreationTimestamp="2025-10-13 18:31:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:45.138300958 +0000 UTC m=+1040.042667038" watchObservedRunningTime="2025-10-13 18:31:45.144425861 +0000 UTC m=+1040.048791941" Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.175529 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-865cd5d4d7-6kmth" podStartSLOduration=40.175512417 podStartE2EDuration="40.175512417s" podCreationTimestamp="2025-10-13 18:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:45.164690792 +0000 UTC m=+1040.069056882" watchObservedRunningTime="2025-10-13 18:31:45.175512417 +0000 UTC m=+1040.079878497" Oct 13 18:31:45 crc kubenswrapper[4974]: I1013 18:31:45.239293 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8475fc656f-dnpll"] Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.117003 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-579fddb58d-n5xbk" event={"ID":"dec721b6-7daa-4481-8e76-df9054f32f97","Type":"ContainerStarted","Data":"0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548"} Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.119467 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.122546 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8475fc656f-dnpll" event={"ID":"59a33676-139f-4010-ab9a-25832163ab83","Type":"ContainerStarted","Data":"7266a64bed8d35c1dd4fc4c560b6d44c101714a77ef830a4aec14dbe8dd75cc3"} Oct 13 
18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.122578 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8475fc656f-dnpll" event={"ID":"59a33676-139f-4010-ab9a-25832163ab83","Type":"ContainerStarted","Data":"945d85e822d9b22f66fd311095b59f0c19886a8c4a07feb882caed9c2d67a869"} Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.122589 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8475fc656f-dnpll" event={"ID":"59a33676-139f-4010-ab9a-25832163ab83","Type":"ContainerStarted","Data":"b9944b8768cf21a854a972cc6c8541ce120619d03e182f11002f95c354299e02"} Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.123208 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.131669 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" event={"ID":"1fec344f-2e1a-4913-a239-c891d311e830","Type":"ContainerStarted","Data":"4a9bc4ff2f672bacc4a8f4f874949ccc353370d3bdef6a3474382ba992d8b89b"} Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.131709 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.145488 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-579fddb58d-n5xbk" podStartSLOduration=5.145468227 podStartE2EDuration="5.145468227s" podCreationTimestamp="2025-10-13 18:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:46.137733579 +0000 UTC m=+1041.042099659" watchObservedRunningTime="2025-10-13 18:31:46.145468227 +0000 UTC m=+1041.049834307" Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.169740 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-549f94c95-vzgxw" podStartSLOduration=5.169715351 podStartE2EDuration="5.169715351s" podCreationTimestamp="2025-10-13 18:31:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:46.153455963 +0000 UTC m=+1041.057822043" watchObservedRunningTime="2025-10-13 18:31:46.169715351 +0000 UTC m=+1041.074081451" Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.183692 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8475fc656f-dnpll" podStartSLOduration=2.183675935 podStartE2EDuration="2.183675935s" podCreationTimestamp="2025-10-13 18:31:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:46.176820081 +0000 UTC m=+1041.081186171" watchObservedRunningTime="2025-10-13 18:31:46.183675935 +0000 UTC m=+1041.088042015" Oct 13 18:31:46 crc kubenswrapper[4974]: I1013 18:31:46.257879 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:31:47 crc kubenswrapper[4974]: I1013 18:31:47.138067 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:31:47 crc kubenswrapper[4974]: I1013 18:31:47.722383 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 13 18:31:47 crc kubenswrapper[4974]: I1013 18:31:47.920685 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 13 18:31:48 crc kubenswrapper[4974]: I1013 18:31:48.094478 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 13 18:31:50 crc kubenswrapper[4974]: I1013 18:31:50.170278 4974 generic.go:334] "Generic (PLEG): container finished" podID="49fb758b-0291-4449-aa49-7e191cc1b2dc" 
containerID="eb645210fcad7a793e788506a5f516ba05c4c3d70cb3589162f560823eeb8cfb" exitCode=0 Oct 13 18:31:50 crc kubenswrapper[4974]: I1013 18:31:50.170932 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-484wv" event={"ID":"49fb758b-0291-4449-aa49-7e191cc1b2dc","Type":"ContainerDied","Data":"eb645210fcad7a793e788506a5f516ba05c4c3d70cb3589162f560823eeb8cfb"} Oct 13 18:31:50 crc kubenswrapper[4974]: I1013 18:31:50.173372 4974 generic.go:334] "Generic (PLEG): container finished" podID="af98b1ee-3954-4c18-9656-f61280f56b95" containerID="dee013a4e07ee8ac5a0f2cd484903875bfd6fcb36664a869168c4b0a3a09dc9e" exitCode=0 Oct 13 18:31:50 crc kubenswrapper[4974]: I1013 18:31:50.173475 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2q69p" event={"ID":"af98b1ee-3954-4c18-9656-f61280f56b95","Type":"ContainerDied","Data":"dee013a4e07ee8ac5a0f2cd484903875bfd6fcb36664a869168c4b0a3a09dc9e"} Oct 13 18:31:50 crc kubenswrapper[4974]: I1013 18:31:50.175284 4974 generic.go:334] "Generic (PLEG): container finished" podID="f29d4def-3da3-43a3-8331-f1ee4644dad2" containerID="eef82df6ce6cd9ca241efc80766eea2428b0a3c27c2c94a2478846595e58ac9a" exitCode=0 Oct 13 18:31:50 crc kubenswrapper[4974]: I1013 18:31:50.175352 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l2v67" event={"ID":"f29d4def-3da3-43a3-8331-f1ee4644dad2","Type":"ContainerDied","Data":"eef82df6ce6cd9ca241efc80766eea2428b0a3c27c2c94a2478846595e58ac9a"} Oct 13 18:31:50 crc kubenswrapper[4974]: I1013 18:31:50.179712 4974 generic.go:334] "Generic (PLEG): container finished" podID="9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" containerID="69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90" exitCode=1 Oct 13 18:31:50 crc kubenswrapper[4974]: I1013 18:31:50.179759 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" 
event={"ID":"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e","Type":"ContainerDied","Data":"69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90"} Oct 13 18:31:50 crc kubenswrapper[4974]: I1013 18:31:50.180637 4974 scope.go:117] "RemoveContainer" containerID="69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.567190 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-2q69p" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.643850 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-484wv" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.650161 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.673955 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.675220 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.710018 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2k25p\" (UniqueName: \"kubernetes.io/projected/af98b1ee-3954-4c18-9656-f61280f56b95-kube-api-access-2k25p\") pod \"af98b1ee-3954-4c18-9656-f61280f56b95\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.710068 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-credential-keys\") pod \"af98b1ee-3954-4c18-9656-f61280f56b95\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " Oct 13 18:31:51 crc kubenswrapper[4974]: 
I1013 18:31:51.710280 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-config-data\") pod \"af98b1ee-3954-4c18-9656-f61280f56b95\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.710303 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-combined-ca-bundle\") pod \"af98b1ee-3954-4c18-9656-f61280f56b95\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.710346 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-scripts\") pod \"af98b1ee-3954-4c18-9656-f61280f56b95\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.710375 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-fernet-keys\") pod \"af98b1ee-3954-4c18-9656-f61280f56b95\" (UID: \"af98b1ee-3954-4c18-9656-f61280f56b95\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.719816 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "af98b1ee-3954-4c18-9656-f61280f56b95" (UID: "af98b1ee-3954-4c18-9656-f61280f56b95"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.719919 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "af98b1ee-3954-4c18-9656-f61280f56b95" (UID: "af98b1ee-3954-4c18-9656-f61280f56b95"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.719817 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af98b1ee-3954-4c18-9656-f61280f56b95-kube-api-access-2k25p" (OuterVolumeSpecName: "kube-api-access-2k25p") pod "af98b1ee-3954-4c18-9656-f61280f56b95" (UID: "af98b1ee-3954-4c18-9656-f61280f56b95"). InnerVolumeSpecName "kube-api-access-2k25p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.721010 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-scripts" (OuterVolumeSpecName: "scripts") pod "af98b1ee-3954-4c18-9656-f61280f56b95" (UID: "af98b1ee-3954-4c18-9656-f61280f56b95"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.747793 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-config-data" (OuterVolumeSpecName: "config-data") pod "af98b1ee-3954-4c18-9656-f61280f56b95" (UID: "af98b1ee-3954-4c18-9656-f61280f56b95"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.771741 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "af98b1ee-3954-4c18-9656-f61280f56b95" (UID: "af98b1ee-3954-4c18-9656-f61280f56b95"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.783003 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.784135 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.811438 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-scripts\") pod \"49fb758b-0291-4449-aa49-7e191cc1b2dc\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.811501 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-config-data\") pod \"49fb758b-0291-4449-aa49-7e191cc1b2dc\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.811559 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bht4g\" (UniqueName: \"kubernetes.io/projected/f29d4def-3da3-43a3-8331-f1ee4644dad2-kube-api-access-bht4g\") pod \"f29d4def-3da3-43a3-8331-f1ee4644dad2\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.811579 4974 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-combined-ca-bundle\") pod \"f29d4def-3da3-43a3-8331-f1ee4644dad2\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.811683 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49fb758b-0291-4449-aa49-7e191cc1b2dc-logs\") pod \"49fb758b-0291-4449-aa49-7e191cc1b2dc\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.811745 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-db-sync-config-data\") pod \"f29d4def-3da3-43a3-8331-f1ee4644dad2\" (UID: \"f29d4def-3da3-43a3-8331-f1ee4644dad2\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.811769 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gvmq\" (UniqueName: \"kubernetes.io/projected/49fb758b-0291-4449-aa49-7e191cc1b2dc-kube-api-access-6gvmq\") pod \"49fb758b-0291-4449-aa49-7e191cc1b2dc\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.811815 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-combined-ca-bundle\") pod \"49fb758b-0291-4449-aa49-7e191cc1b2dc\" (UID: \"49fb758b-0291-4449-aa49-7e191cc1b2dc\") " Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.812340 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 
18:31:51.812362 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.812372 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.812381 4974 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.812390 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2k25p\" (UniqueName: \"kubernetes.io/projected/af98b1ee-3954-4c18-9656-f61280f56b95-kube-api-access-2k25p\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.812398 4974 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/af98b1ee-3954-4c18-9656-f61280f56b95-credential-keys\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.815834 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-scripts" (OuterVolumeSpecName: "scripts") pod "49fb758b-0291-4449-aa49-7e191cc1b2dc" (UID: "49fb758b-0291-4449-aa49-7e191cc1b2dc"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.820721 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49fb758b-0291-4449-aa49-7e191cc1b2dc-logs" (OuterVolumeSpecName: "logs") pod "49fb758b-0291-4449-aa49-7e191cc1b2dc" (UID: "49fb758b-0291-4449-aa49-7e191cc1b2dc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.823636 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29d4def-3da3-43a3-8331-f1ee4644dad2-kube-api-access-bht4g" (OuterVolumeSpecName: "kube-api-access-bht4g") pod "f29d4def-3da3-43a3-8331-f1ee4644dad2" (UID: "f29d4def-3da3-43a3-8331-f1ee4644dad2"). InnerVolumeSpecName "kube-api-access-bht4g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.826870 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49fb758b-0291-4449-aa49-7e191cc1b2dc-kube-api-access-6gvmq" (OuterVolumeSpecName: "kube-api-access-6gvmq") pod "49fb758b-0291-4449-aa49-7e191cc1b2dc" (UID: "49fb758b-0291-4449-aa49-7e191cc1b2dc"). InnerVolumeSpecName "kube-api-access-6gvmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.828213 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f29d4def-3da3-43a3-8331-f1ee4644dad2" (UID: "f29d4def-3da3-43a3-8331-f1ee4644dad2"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.850820 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f29d4def-3da3-43a3-8331-f1ee4644dad2" (UID: "f29d4def-3da3-43a3-8331-f1ee4644dad2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.864958 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49fb758b-0291-4449-aa49-7e191cc1b2dc" (UID: "49fb758b-0291-4449-aa49-7e191cc1b2dc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.870358 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-config-data" (OuterVolumeSpecName: "config-data") pod "49fb758b-0291-4449-aa49-7e191cc1b2dc" (UID: "49fb758b-0291-4449-aa49-7e191cc1b2dc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.913935 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.914513 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bht4g\" (UniqueName: \"kubernetes.io/projected/f29d4def-3da3-43a3-8331-f1ee4644dad2-kube-api-access-bht4g\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.914540 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.914562 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49fb758b-0291-4449-aa49-7e191cc1b2dc-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.914577 4974 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f29d4def-3da3-43a3-8331-f1ee4644dad2-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.914592 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gvmq\" (UniqueName: \"kubernetes.io/projected/49fb758b-0291-4449-aa49-7e191cc1b2dc-kube-api-access-6gvmq\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 18:31:51.914635 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:51 crc kubenswrapper[4974]: I1013 
18:31:51.914690 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49fb758b-0291-4449-aa49-7e191cc1b2dc-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.176798 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.248595 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-l2v67" event={"ID":"f29d4def-3da3-43a3-8331-f1ee4644dad2","Type":"ContainerDied","Data":"a8ddb95503c6e81d03b2773af57e9348661c1e726c20422690017805867446fe"} Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.249117 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8ddb95503c6e81d03b2773af57e9348661c1e726c20422690017805867446fe" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.248668 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-l2v67" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.286135 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e","Type":"ContainerStarted","Data":"e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3"} Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.307517 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-484wv" event={"ID":"49fb758b-0291-4449-aa49-7e191cc1b2dc","Type":"ContainerDied","Data":"5db80e94ef9ef05328eeffeb6e5af6fb9728f58c3517938302089031313172a9"} Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.307556 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db80e94ef9ef05328eeffeb6e5af6fb9728f58c3517938302089031313172a9" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.307625 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-484wv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.325869 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-2q69p" event={"ID":"af98b1ee-3954-4c18-9656-f61280f56b95","Type":"ContainerDied","Data":"bcbf368e16ea60b4ea1417514070ea11e79e2c88aa2c7e65e4ab4bd1f554fb3b"} Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.325907 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcbf368e16ea60b4ea1417514070ea11e79e2c88aa2c7e65e4ab4bd1f554fb3b" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.325984 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-2q69p" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.340059 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"300c099d-eca4-4f0c-a79f-dde4dddd8a98","Type":"ContainerStarted","Data":"817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7"} Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.382341 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58d9fd7b6c-5j6jb"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.382585 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" podUID="7ec9f25a-64d8-4904-b1df-03678b32a639" containerName="dnsmasq-dns" containerID="cri-o://0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8" gracePeriod=10 Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.441284 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6b66599996-gvfwf"] Oct 13 18:31:52 crc kubenswrapper[4974]: E1013 18:31:52.441795 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29d4def-3da3-43a3-8331-f1ee4644dad2" containerName="barbican-db-sync" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.441826 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29d4def-3da3-43a3-8331-f1ee4644dad2" containerName="barbican-db-sync" Oct 13 18:31:52 crc kubenswrapper[4974]: E1013 18:31:52.441841 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af98b1ee-3954-4c18-9656-f61280f56b95" containerName="keystone-bootstrap" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.441847 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="af98b1ee-3954-4c18-9656-f61280f56b95" containerName="keystone-bootstrap" Oct 13 18:31:52 crc kubenswrapper[4974]: E1013 18:31:52.441860 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="49fb758b-0291-4449-aa49-7e191cc1b2dc" containerName="placement-db-sync" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.441867 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="49fb758b-0291-4449-aa49-7e191cc1b2dc" containerName="placement-db-sync" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.442092 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="49fb758b-0291-4449-aa49-7e191cc1b2dc" containerName="placement-db-sync" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.442115 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="af98b1ee-3954-4c18-9656-f61280f56b95" containerName="keystone-bootstrap" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.442148 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29d4def-3da3-43a3-8331-f1ee4644dad2" containerName="barbican-db-sync" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.443955 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.446065 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.446412 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.447162 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.448181 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.448571 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-nrvcn" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.454028 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b66599996-gvfwf"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.466997 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6f98cf4cc8-pgzsc"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.468145 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.470049 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.471046 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.471227 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.471333 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-788q9" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.471425 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.471516 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.476163 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f98cf4cc8-pgzsc"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531024 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-internal-tls-certs\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531089 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-public-tls-certs\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " 
pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531111 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-scripts\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531154 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-combined-ca-bundle\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531187 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jddrm\" (UniqueName: \"kubernetes.io/projected/c6c335f4-044a-4970-8a80-05755d65b00a-kube-api-access-jddrm\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531530 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-public-tls-certs\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531575 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-config-data\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: 
\"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531615 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-combined-ca-bundle\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531663 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-internal-tls-certs\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531731 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-scripts\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531864 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-credential-keys\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531945 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-config-data\") pod \"placement-6b66599996-gvfwf\" (UID: 
\"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.531975 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-fernet-keys\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.532010 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6c335f4-044a-4970-8a80-05755d65b00a-logs\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.532036 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nv9w\" (UniqueName: \"kubernetes.io/projected/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-kube-api-access-8nv9w\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.576450 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5b867446cf-7crxm"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.578619 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.586846 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.589911 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-77n5s" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.596222 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6f69856764-9cjzr"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.598265 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.603207 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.603914 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.614124 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5b867446cf-7crxm"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.632746 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f69856764-9cjzr"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633749 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4plhb\" (UniqueName: \"kubernetes.io/projected/5acb3840-d265-46e7-8a2b-630f1bf38ec5-kube-api-access-4plhb\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633780 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-credential-keys\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633798 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5acb3840-d265-46e7-8a2b-630f1bf38ec5-config-data-custom\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633829 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-config-data\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633848 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-fernet-keys\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633869 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6c335f4-044a-4970-8a80-05755d65b00a-logs\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633883 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acb3840-d265-46e7-8a2b-630f1bf38ec5-config-data\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633902 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nv9w\" (UniqueName: \"kubernetes.io/projected/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-kube-api-access-8nv9w\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633920 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acb3840-d265-46e7-8a2b-630f1bf38ec5-combined-ca-bundle\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633952 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-internal-tls-certs\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.633986 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-public-tls-certs\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.634003 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-scripts\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.634030 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-combined-ca-bundle\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.634059 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jddrm\" (UniqueName: \"kubernetes.io/projected/c6c335f4-044a-4970-8a80-05755d65b00a-kube-api-access-jddrm\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.634079 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-public-tls-certs\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.634096 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-config-data\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.634113 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-combined-ca-bundle\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.634131 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-internal-tls-certs\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.634157 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-scripts\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.634175 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5acb3840-d265-46e7-8a2b-630f1bf38ec5-logs\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.635630 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6c335f4-044a-4970-8a80-05755d65b00a-logs\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.656766 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-internal-tls-certs\") pod 
\"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.664618 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-combined-ca-bundle\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.671359 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-public-tls-certs\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.672718 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c686c5b55-x94zv"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.674137 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.674720 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-internal-tls-certs\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.679371 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-config-data\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.679686 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-credential-keys\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.684278 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-config-data\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.684529 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-fernet-keys\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.688981 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jddrm\" (UniqueName: \"kubernetes.io/projected/c6c335f4-044a-4970-8a80-05755d65b00a-kube-api-access-jddrm\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.689523 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nv9w\" (UniqueName: \"kubernetes.io/projected/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-kube-api-access-8nv9w\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.689596 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-combined-ca-bundle\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.695157 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c686c5b55-x94zv"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.702613 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-public-tls-certs\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.702883 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52-scripts\") pod \"keystone-6f98cf4cc8-pgzsc\" (UID: \"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52\") " pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 
18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.707994 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c6c335f4-044a-4970-8a80-05755d65b00a-scripts\") pod \"placement-6b66599996-gvfwf\" (UID: \"c6c335f4-044a-4970-8a80-05755d65b00a\") " pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.722083 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.735305 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45c18dbd-7083-463d-b845-f213bf6ae1ce-config-data-custom\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.735341 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c18dbd-7083-463d-b845-f213bf6ae1ce-logs\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.735372 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-sb\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.735392 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/45c18dbd-7083-463d-b845-f213bf6ae1ce-config-data\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.735608 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-config\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.735641 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5acb3840-d265-46e7-8a2b-630f1bf38ec5-logs\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.735678 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2j74\" (UniqueName: \"kubernetes.io/projected/60c938e8-f8e0-4006-8124-6929b8945dbf-kube-api-access-t2j74\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.735701 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf9cp\" (UniqueName: \"kubernetes.io/projected/45c18dbd-7083-463d-b845-f213bf6ae1ce-kube-api-access-xf9cp\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.736100 4974 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5acb3840-d265-46e7-8a2b-630f1bf38ec5-logs\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.736190 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4plhb\" (UniqueName: \"kubernetes.io/projected/5acb3840-d265-46e7-8a2b-630f1bf38ec5-kube-api-access-4plhb\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.736230 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5acb3840-d265-46e7-8a2b-630f1bf38ec5-config-data-custom\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.736283 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-svc\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.736313 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c18dbd-7083-463d-b845-f213bf6ae1ce-combined-ca-bundle\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.736336 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-nb\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.736391 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acb3840-d265-46e7-8a2b-630f1bf38ec5-config-data\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.736406 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-swift-storage-0\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.736434 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acb3840-d265-46e7-8a2b-630f1bf38ec5-combined-ca-bundle\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.754493 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5acb3840-d265-46e7-8a2b-630f1bf38ec5-config-data-custom\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.757425 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acb3840-d265-46e7-8a2b-630f1bf38ec5-config-data\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.767713 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4plhb\" (UniqueName: \"kubernetes.io/projected/5acb3840-d265-46e7-8a2b-630f1bf38ec5-kube-api-access-4plhb\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.772242 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acb3840-d265-46e7-8a2b-630f1bf38ec5-combined-ca-bundle\") pod \"barbican-worker-5b867446cf-7crxm\" (UID: \"5acb3840-d265-46e7-8a2b-630f1bf38ec5\") " pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.848703 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45c18dbd-7083-463d-b845-f213bf6ae1ce-config-data-custom\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.848752 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c18dbd-7083-463d-b845-f213bf6ae1ce-logs\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.848773 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-sb\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.848787 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c18dbd-7083-463d-b845-f213bf6ae1ce-config-data\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.848849 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-config\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.848877 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2j74\" (UniqueName: \"kubernetes.io/projected/60c938e8-f8e0-4006-8124-6929b8945dbf-kube-api-access-t2j74\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.848897 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf9cp\" (UniqueName: \"kubernetes.io/projected/45c18dbd-7083-463d-b845-f213bf6ae1ce-kube-api-access-xf9cp\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 
18:31:52.848947 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-svc\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.848965 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c18dbd-7083-463d-b845-f213bf6ae1ce-combined-ca-bundle\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.848980 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-nb\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.849004 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-swift-storage-0\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.850119 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/45c18dbd-7083-463d-b845-f213bf6ae1ce-logs\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.850929 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-sb\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.851203 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.853514 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-config\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.854055 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/45c18dbd-7083-463d-b845-f213bf6ae1ce-config-data-custom\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.855272 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-swift-storage-0\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.860486 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.860683 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7d5467cfd-m6tpn"] Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.862307 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.864081 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c18dbd-7083-463d-b845-f213bf6ae1ce-config-data\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.868046 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.871398 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2j74\" (UniqueName: \"kubernetes.io/projected/60c938e8-f8e0-4006-8124-6929b8945dbf-kube-api-access-t2j74\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.871815 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf9cp\" (UniqueName: \"kubernetes.io/projected/45c18dbd-7083-463d-b845-f213bf6ae1ce-kube-api-access-xf9cp\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.872848 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d5467cfd-m6tpn"] Oct 13 18:31:52 crc 
kubenswrapper[4974]: I1013 18:31:52.878337 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.892493 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.895786 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-svc\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.916812 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-nb\") pod \"dnsmasq-dns-6c686c5b55-x94zv\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.926544 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c18dbd-7083-463d-b845-f213bf6ae1ce-combined-ca-bundle\") pod \"barbican-keystone-listener-6f69856764-9cjzr\" (UID: \"45c18dbd-7083-463d-b845-f213bf6ae1ce\") " pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.927133 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.944583 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5b867446cf-7crxm" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.948153 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.950444 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5srs\" (UniqueName: \"kubernetes.io/projected/64946947-a861-4b5b-a016-102ced68c4b4-kube-api-access-m5srs\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.950469 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64946947-a861-4b5b-a016-102ced68c4b4-logs\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.950571 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-combined-ca-bundle\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.950645 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data-custom\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.950763 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:52 crc kubenswrapper[4974]: I1013 18:31:52.976004 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.012932 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.053041 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-combined-ca-bundle\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.053336 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data-custom\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.053433 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.053555 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5srs\" (UniqueName: 
\"kubernetes.io/projected/64946947-a861-4b5b-a016-102ced68c4b4-kube-api-access-m5srs\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.053683 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64946947-a861-4b5b-a016-102ced68c4b4-logs\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.058852 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64946947-a861-4b5b-a016-102ced68c4b4-logs\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.066262 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data-custom\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.074185 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-combined-ca-bundle\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.077405 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5srs\" (UniqueName: \"kubernetes.io/projected/64946947-a861-4b5b-a016-102ced68c4b4-kube-api-access-m5srs\") pod 
\"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.090843 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data\") pod \"barbican-api-7d5467cfd-m6tpn\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.223379 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.245450 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.246381 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.335221 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.348916 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.396708 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-nb\") pod \"7ec9f25a-64d8-4904-b1df-03678b32a639\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.396783 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-swift-storage-0\") pod \"7ec9f25a-64d8-4904-b1df-03678b32a639\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.396812 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-sb\") pod \"7ec9f25a-64d8-4904-b1df-03678b32a639\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.396829 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7rbf\" (UniqueName: \"kubernetes.io/projected/7ec9f25a-64d8-4904-b1df-03678b32a639-kube-api-access-z7rbf\") pod \"7ec9f25a-64d8-4904-b1df-03678b32a639\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.396928 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-config\") pod \"7ec9f25a-64d8-4904-b1df-03678b32a639\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.396977 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-svc\") pod \"7ec9f25a-64d8-4904-b1df-03678b32a639\" (UID: \"7ec9f25a-64d8-4904-b1df-03678b32a639\") " Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.427154 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ec9f25a-64d8-4904-b1df-03678b32a639-kube-api-access-z7rbf" (OuterVolumeSpecName: "kube-api-access-z7rbf") pod "7ec9f25a-64d8-4904-b1df-03678b32a639" (UID: "7ec9f25a-64d8-4904-b1df-03678b32a639"). InnerVolumeSpecName "kube-api-access-z7rbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.474608 4974 generic.go:334] "Generic (PLEG): container finished" podID="7ec9f25a-64d8-4904-b1df-03678b32a639" containerID="0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8" exitCode=0 Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.475786 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.476152 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" event={"ID":"7ec9f25a-64d8-4904-b1df-03678b32a639","Type":"ContainerDied","Data":"0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8"} Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.476176 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58d9fd7b6c-5j6jb" event={"ID":"7ec9f25a-64d8-4904-b1df-03678b32a639","Type":"ContainerDied","Data":"26ad524b7b383bd640a52ef490e9fb8f0d3073fa9a1bd1bdee19624daf5fd88a"} Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.476191 4974 scope.go:117] "RemoveContainer" containerID="0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.477113 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.498061 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7ec9f25a-64d8-4904-b1df-03678b32a639" (UID: "7ec9f25a-64d8-4904-b1df-03678b32a639"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.499616 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7rbf\" (UniqueName: \"kubernetes.io/projected/7ec9f25a-64d8-4904-b1df-03678b32a639-kube-api-access-z7rbf\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.499634 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.527204 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-config" (OuterVolumeSpecName: "config") pod "7ec9f25a-64d8-4904-b1df-03678b32a639" (UID: "7ec9f25a-64d8-4904-b1df-03678b32a639"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.601471 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.615994 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7ec9f25a-64d8-4904-b1df-03678b32a639" (UID: "7ec9f25a-64d8-4904-b1df-03678b32a639"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.616814 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7ec9f25a-64d8-4904-b1df-03678b32a639" (UID: "7ec9f25a-64d8-4904-b1df-03678b32a639"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.616957 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7ec9f25a-64d8-4904-b1df-03678b32a639" (UID: "7ec9f25a-64d8-4904-b1df-03678b32a639"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.696174 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.701642 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.702827 4974 scope.go:117] "RemoveContainer" containerID="937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.709639 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.709708 4974 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-dns-swift-storage-0\") on node \"crc\" 
DevicePath \"\"" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.709718 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7ec9f25a-64d8-4904-b1df-03678b32a639-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.714968 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6b66599996-gvfwf"] Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.762434 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.781942 4974 scope.go:117] "RemoveContainer" containerID="0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8" Oct 13 18:31:53 crc kubenswrapper[4974]: E1013 18:31:53.786204 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8\": container with ID starting with 0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8 not found: ID does not exist" containerID="0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.786248 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8"} err="failed to get container status \"0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8\": rpc error: code = NotFound desc = could not find container \"0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8\": container with ID starting with 0807595d2144280a3e3f71f9e91e08dfb94701ac245700b879a5565c61853bd8 not found: ID does not exist" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.786276 4974 scope.go:117] "RemoveContainer" 
containerID="937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821" Oct 13 18:31:53 crc kubenswrapper[4974]: E1013 18:31:53.793539 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821\": container with ID starting with 937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821 not found: ID does not exist" containerID="937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.793583 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821"} err="failed to get container status \"937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821\": rpc error: code = NotFound desc = could not find container \"937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821\": container with ID starting with 937ab733917d83140731d1032ed41ec628ebb7376d5e30cba5293727841a8821 not found: ID does not exist" Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.846927 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.929038 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:31:53 crc kubenswrapper[4974]: I1013 18:31:53.990623 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58d9fd7b6c-5j6jb"] Oct 13 18:31:54 crc kubenswrapper[4974]: I1013 18:31:54.002765 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58d9fd7b6c-5j6jb"] Oct 13 18:31:54 crc kubenswrapper[4974]: I1013 18:31:54.020359 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6f98cf4cc8-pgzsc"] Oct 13 18:31:54 crc kubenswrapper[4974]: I1013 
18:31:54.182434 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5b867446cf-7crxm"] Oct 13 18:31:54 crc kubenswrapper[4974]: I1013 18:31:54.473019 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7d5467cfd-m6tpn"] Oct 13 18:31:54 crc kubenswrapper[4974]: I1013 18:31:54.493450 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6f69856764-9cjzr"] Oct 13 18:31:54 crc kubenswrapper[4974]: I1013 18:31:54.496089 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b66599996-gvfwf" event={"ID":"c6c335f4-044a-4970-8a80-05755d65b00a","Type":"ContainerStarted","Data":"ddf02bfa364fc2e8324aa4d62ea91318f5402e1f3bb4a0a8cc416a58500f610c"} Oct 13 18:31:54 crc kubenswrapper[4974]: I1013 18:31:54.507665 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c686c5b55-x94zv"] Oct 13 18:31:55 crc kubenswrapper[4974]: I1013 18:31:55.502055 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="751ed806-3991-4b72-91fb-ff5d56c66849" containerName="watcher-applier" containerID="cri-o://9f3538a3336ad09da6e2d412fce7099b1f11061277d16c5d59fe29b11072dbbc" gracePeriod=30 Oct 13 18:31:55 crc kubenswrapper[4974]: I1013 18:31:55.502238 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" containerName="watcher-decision-engine" containerID="cri-o://e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3" gracePeriod=30 Oct 13 18:31:55 crc kubenswrapper[4974]: I1013 18:31:55.830985 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ec9f25a-64d8-4904-b1df-03678b32a639" path="/var/lib/kubelet/pods/7ec9f25a-64d8-4904-b1df-03678b32a639/volumes" Oct 13 18:31:55 crc kubenswrapper[4974]: I1013 18:31:55.997005 4974 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/barbican-api-cf595f5c8-dtfck"] Oct 13 18:31:55 crc kubenswrapper[4974]: E1013 18:31:55.997492 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec9f25a-64d8-4904-b1df-03678b32a639" containerName="dnsmasq-dns" Oct 13 18:31:55 crc kubenswrapper[4974]: I1013 18:31:55.997516 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec9f25a-64d8-4904-b1df-03678b32a639" containerName="dnsmasq-dns" Oct 13 18:31:55 crc kubenswrapper[4974]: E1013 18:31:55.997530 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ec9f25a-64d8-4904-b1df-03678b32a639" containerName="init" Oct 13 18:31:55 crc kubenswrapper[4974]: I1013 18:31:55.997538 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ec9f25a-64d8-4904-b1df-03678b32a639" containerName="init" Oct 13 18:31:55 crc kubenswrapper[4974]: I1013 18:31:55.997788 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ec9f25a-64d8-4904-b1df-03678b32a639" containerName="dnsmasq-dns" Oct 13 18:31:55 crc kubenswrapper[4974]: I1013 18:31:55.999096 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.001552 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.007417 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.014634 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cf595f5c8-dtfck"] Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.155472 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94lp4\" (UniqueName: \"kubernetes.io/projected/b67ca997-2edf-492b-ab80-f618c7201a29-kube-api-access-94lp4\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.155697 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b67ca997-2edf-492b-ab80-f618c7201a29-logs\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.155924 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-public-tls-certs\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.156072 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-config-data\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.156168 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-internal-tls-certs\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.156327 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-combined-ca-bundle\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.156501 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-config-data-custom\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.258094 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-combined-ca-bundle\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.258144 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-config-data-custom\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.258186 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94lp4\" (UniqueName: \"kubernetes.io/projected/b67ca997-2edf-492b-ab80-f618c7201a29-kube-api-access-94lp4\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.258209 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b67ca997-2edf-492b-ab80-f618c7201a29-logs\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.258262 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-public-tls-certs\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.258294 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-config-data\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.258332 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-internal-tls-certs\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.259037 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b67ca997-2edf-492b-ab80-f618c7201a29-logs\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.264436 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-combined-ca-bundle\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.264449 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-public-tls-certs\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.267424 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-config-data-custom\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.268323 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-config-data\") pod 
\"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.282137 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b67ca997-2edf-492b-ab80-f618c7201a29-internal-tls-certs\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.295108 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94lp4\" (UniqueName: \"kubernetes.io/projected/b67ca997-2edf-492b-ab80-f618c7201a29-kube-api-access-94lp4\") pod \"barbican-api-cf595f5c8-dtfck\" (UID: \"b67ca997-2edf-492b-ab80-f618c7201a29\") " pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.325276 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:31:56 crc kubenswrapper[4974]: I1013 18:31:56.514241 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jgthj" event={"ID":"ab455ee5-f8bd-4d4e-b179-524fa3edcc52","Type":"ContainerStarted","Data":"ff5dec3b18644db01f42beb495cb595475713647a9fa6ae84f9caa9404dd6b33"} Oct 13 18:31:56 crc kubenswrapper[4974]: W1013 18:31:56.628375 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63d27b98_4a56_4fbc_a0c6_dd31bd3dfc52.slice/crio-5ca858da1ac99bbe52aeae45187fded7f47f5fe73a11153d3d65deb4e58c2596 WatchSource:0}: Error finding container 5ca858da1ac99bbe52aeae45187fded7f47f5fe73a11153d3d65deb4e58c2596: Status 404 returned error can't find the container with id 5ca858da1ac99bbe52aeae45187fded7f47f5fe73a11153d3d65deb4e58c2596 Oct 13 18:31:56 crc kubenswrapper[4974]: W1013 18:31:56.630275 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5acb3840_d265_46e7_8a2b_630f1bf38ec5.slice/crio-8f1b8a784dc855b1770b70ad2f643ea9c863fd6a63a898ba51d7ea091165fff3 WatchSource:0}: Error finding container 8f1b8a784dc855b1770b70ad2f643ea9c863fd6a63a898ba51d7ea091165fff3: Status 404 returned error can't find the container with id 8f1b8a784dc855b1770b70ad2f643ea9c863fd6a63a898ba51d7ea091165fff3 Oct 13 18:31:56 crc kubenswrapper[4974]: W1013 18:31:56.633243 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45c18dbd_7083_463d_b845_f213bf6ae1ce.slice/crio-923c33774e5a22e46a8f5a04a589505375dba49ec521b47342d8b7ba373a76e7 WatchSource:0}: Error finding container 923c33774e5a22e46a8f5a04a589505375dba49ec521b47342d8b7ba373a76e7: Status 404 returned error can't find the container with id 923c33774e5a22e46a8f5a04a589505375dba49ec521b47342d8b7ba373a76e7 
Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.208572 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.307999 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-config-data\") pod \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.308086 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-logs\") pod \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.308206 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-custom-prometheus-ca\") pod \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.308227 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-combined-ca-bundle\") pod \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.308263 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxvct\" (UniqueName: \"kubernetes.io/projected/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-kube-api-access-zxvct\") pod \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\" (UID: \"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e\") " Oct 13 18:31:57 crc kubenswrapper[4974]: 
I1013 18:31:57.311033 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-logs" (OuterVolumeSpecName: "logs") pod "9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" (UID: "9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.328167 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-cf595f5c8-dtfck"] Oct 13 18:31:57 crc kubenswrapper[4974]: W1013 18:31:57.335209 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb67ca997_2edf_492b_ab80_f618c7201a29.slice/crio-709f8ff1855e61e803994c8d7dbc775f8db0af8da6a35447db0f8b53991d7c60 WatchSource:0}: Error finding container 709f8ff1855e61e803994c8d7dbc775f8db0af8da6a35447db0f8b53991d7c60: Status 404 returned error can't find the container with id 709f8ff1855e61e803994c8d7dbc775f8db0af8da6a35447db0f8b53991d7c60 Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.337243 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-kube-api-access-zxvct" (OuterVolumeSpecName: "kube-api-access-zxvct") pod "9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" (UID: "9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e"). InnerVolumeSpecName "kube-api-access-zxvct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.417888 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.417914 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxvct\" (UniqueName: \"kubernetes.io/projected/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-kube-api-access-zxvct\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.451709 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" (UID: "9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.541327 4974 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.564790 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" (UID: "9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.575825 4974 generic.go:334] "Generic (PLEG): container finished" podID="60c938e8-f8e0-4006-8124-6929b8945dbf" containerID="b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec" exitCode=0 Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.575886 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" event={"ID":"60c938e8-f8e0-4006-8124-6929b8945dbf","Type":"ContainerDied","Data":"b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.575913 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" event={"ID":"60c938e8-f8e0-4006-8124-6929b8945dbf","Type":"ContainerStarted","Data":"23cf2bcc56f8eed0a0309a87aa287f4781dc9840b33d8e02a4b92d5ac319f8c2"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.602773 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-config-data" (OuterVolumeSpecName: "config-data") pod "9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" (UID: "9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.634861 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" event={"ID":"45c18dbd-7083-463d-b845-f213bf6ae1ce","Type":"ContainerStarted","Data":"923c33774e5a22e46a8f5a04a589505375dba49ec521b47342d8b7ba373a76e7"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.645438 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.645465 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.654567 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cf595f5c8-dtfck" event={"ID":"b67ca997-2edf-492b-ab80-f618c7201a29","Type":"ContainerStarted","Data":"709f8ff1855e61e803994c8d7dbc775f8db0af8da6a35447db0f8b53991d7c60"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.694582 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f98cf4cc8-pgzsc" event={"ID":"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52","Type":"ContainerStarted","Data":"6974123fa4c36e6df3bb156d9d85fd65b0e3de3870c4d15c3168ca71a6e5be0e"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.694628 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6f98cf4cc8-pgzsc" event={"ID":"63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52","Type":"ContainerStarted","Data":"5ca858da1ac99bbe52aeae45187fded7f47f5fe73a11153d3d65deb4e58c2596"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.696759 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.716806 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b867446cf-7crxm" event={"ID":"5acb3840-d265-46e7-8a2b-630f1bf38ec5","Type":"ContainerStarted","Data":"8f1b8a784dc855b1770b70ad2f643ea9c863fd6a63a898ba51d7ea091165fff3"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.739402 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5467cfd-m6tpn" event={"ID":"64946947-a861-4b5b-a016-102ced68c4b4","Type":"ContainerStarted","Data":"769bd83937464a78d16b6ccd0c10f94e0f1cbd354f2fbab5a4a538773fc255ac"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.739780 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5467cfd-m6tpn" event={"ID":"64946947-a861-4b5b-a016-102ced68c4b4","Type":"ContainerStarted","Data":"ed6372e23a7facf06ddebe859f7f0ebe70fb9505e4ed3e0a9b537c12bfdf1294"} Oct 13 18:31:57 crc kubenswrapper[4974]: E1013 18:31:57.750791 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f3538a3336ad09da6e2d412fce7099b1f11061277d16c5d59fe29b11072dbbc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 13 18:31:57 crc kubenswrapper[4974]: E1013 18:31:57.767286 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f3538a3336ad09da6e2d412fce7099b1f11061277d16c5d59fe29b11072dbbc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.769573 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6b66599996-gvfwf" 
event={"ID":"c6c335f4-044a-4970-8a80-05755d65b00a","Type":"ContainerStarted","Data":"0a31de9b3f2236c558d48f0998bfea2d29c14d99b834422c7b5883e22738cdbe"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.773636 4974 generic.go:334] "Generic (PLEG): container finished" podID="9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" containerID="e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3" exitCode=1 Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.774630 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.782809 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e","Type":"ContainerDied","Data":"e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.782858 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e","Type":"ContainerDied","Data":"cd59784eb3d7b995313f040942c41bf27d4ae800a62031cb0c30345bda5f3a51"} Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.782875 4974 scope.go:117] "RemoveContainer" containerID="e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.849535 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-jgthj" podStartSLOduration=6.990832086 podStartE2EDuration="56.849510277s" podCreationTimestamp="2025-10-13 18:31:01 +0000 UTC" firstStartedPulling="2025-10-13 18:31:03.183455119 +0000 UTC m=+998.087821199" lastFinishedPulling="2025-10-13 18:31:53.04213331 +0000 UTC m=+1047.946499390" observedRunningTime="2025-10-13 18:31:57.840538234 +0000 UTC m=+1052.744904314" watchObservedRunningTime="2025-10-13 18:31:57.849510277 +0000 UTC 
m=+1052.753876357" Oct 13 18:31:57 crc kubenswrapper[4974]: I1013 18:31:57.850428 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-6f98cf4cc8-pgzsc" podStartSLOduration=5.850422213 podStartE2EDuration="5.850422213s" podCreationTimestamp="2025-10-13 18:31:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:57.754143968 +0000 UTC m=+1052.658510048" watchObservedRunningTime="2025-10-13 18:31:57.850422213 +0000 UTC m=+1052.754788293" Oct 13 18:31:57 crc kubenswrapper[4974]: E1013 18:31:57.859867 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="9f3538a3336ad09da6e2d412fce7099b1f11061277d16c5d59fe29b11072dbbc" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Oct 13 18:31:57 crc kubenswrapper[4974]: E1013 18:31:57.859942 4974 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="751ed806-3991-4b72-91fb-ff5d56c66849" containerName="watcher-applier" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.025779 4974 scope.go:117] "RemoveContainer" containerID="69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.032392 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.059374 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.077705 4974 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/watcher-decision-engine-0"] Oct 13 18:31:58 crc kubenswrapper[4974]: E1013 18:31:58.078152 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" containerName="watcher-decision-engine" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.078167 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" containerName="watcher-decision-engine" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.078361 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" containerName="watcher-decision-engine" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.078383 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" containerName="watcher-decision-engine" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.079171 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.081624 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.084713 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.198008 4974 scope.go:117] "RemoveContainer" containerID="e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3" Oct 13 18:31:58 crc kubenswrapper[4974]: E1013 18:31:58.202158 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3\": container with ID starting with e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3 not found: ID does not exist" 
containerID="e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.202188 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3"} err="failed to get container status \"e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3\": rpc error: code = NotFound desc = could not find container \"e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3\": container with ID starting with e7ad16b93233fbe7004a0232bb5e725bbec0d3e388d92bdd3cbd690b07436ed3 not found: ID does not exist" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.202210 4974 scope.go:117] "RemoveContainer" containerID="69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90" Oct 13 18:31:58 crc kubenswrapper[4974]: E1013 18:31:58.202492 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90\": container with ID starting with 69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90 not found: ID does not exist" containerID="69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.202510 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90"} err="failed to get container status \"69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90\": rpc error: code = NotFound desc = could not find container \"69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90\": container with ID starting with 69be1106350c849f128d2c8a714591bcb40b52a891aee85eaddea340efdd6c90 not found: ID does not exist" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.260385 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89790087-1d9c-4278-b62f-e18a94775048-logs\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.260430 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-config-data\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.260490 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.260890 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.260949 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6cz9\" (UniqueName: \"kubernetes.io/projected/89790087-1d9c-4278-b62f-e18a94775048-kube-api-access-c6cz9\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 
18:31:58.363069 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.363387 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6cz9\" (UniqueName: \"kubernetes.io/projected/89790087-1d9c-4278-b62f-e18a94775048-kube-api-access-c6cz9\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.363520 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89790087-1d9c-4278-b62f-e18a94775048-logs\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.363636 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-config-data\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.363765 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.366731 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89790087-1d9c-4278-b62f-e18a94775048-logs\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.371158 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.373237 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.374330 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-config-data\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.387825 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6cz9\" (UniqueName: \"kubernetes.io/projected/89790087-1d9c-4278-b62f-e18a94775048-kube-api-access-c6cz9\") pod \"watcher-decision-engine-0\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.416134 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.786769 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cf595f5c8-dtfck" event={"ID":"b67ca997-2edf-492b-ab80-f618c7201a29","Type":"ContainerStarted","Data":"3dacf4f9a3e8616ccdf4f2ec55cb4ce3f0c9c18d4869294c42baa10e51f4b829"} Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.788927 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-njgsh" event={"ID":"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3","Type":"ContainerStarted","Data":"010bc063ed964ef76d8cfee518d76c71e443363c2c2521f1ecac16bb15de3e8a"} Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.793003 4974 generic.go:334] "Generic (PLEG): container finished" podID="751ed806-3991-4b72-91fb-ff5d56c66849" containerID="9f3538a3336ad09da6e2d412fce7099b1f11061277d16c5d59fe29b11072dbbc" exitCode=0 Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.793054 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"751ed806-3991-4b72-91fb-ff5d56c66849","Type":"ContainerDied","Data":"9f3538a3336ad09da6e2d412fce7099b1f11061277d16c5d59fe29b11072dbbc"} Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.794540 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5467cfd-m6tpn" event={"ID":"64946947-a861-4b5b-a016-102ced68c4b4","Type":"ContainerStarted","Data":"8d558a53da031efb37d69bc361855430a83cfb892a4e39edc9706487e0c2624c"} Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.794740 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.794778 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.796255 4974 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/placement-6b66599996-gvfwf" event={"ID":"c6c335f4-044a-4970-8a80-05755d65b00a","Type":"ContainerStarted","Data":"c254f6df782c8d05cb44646e021adcfade22ded34a72bd101a1154ee05bc1b1f"} Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.797399 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.797435 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.810194 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-njgsh" podStartSLOduration=3.616707098 podStartE2EDuration="55.810179315s" podCreationTimestamp="2025-10-13 18:31:03 +0000 UTC" firstStartedPulling="2025-10-13 18:31:04.596097443 +0000 UTC m=+999.500463523" lastFinishedPulling="2025-10-13 18:31:56.78956966 +0000 UTC m=+1051.693935740" observedRunningTime="2025-10-13 18:31:58.807101439 +0000 UTC m=+1053.711467519" watchObservedRunningTime="2025-10-13 18:31:58.810179315 +0000 UTC m=+1053.714545395" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.822033 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" event={"ID":"60c938e8-f8e0-4006-8124-6929b8945dbf","Type":"ContainerStarted","Data":"f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694"} Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.822256 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.838491 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7d5467cfd-m6tpn" podStartSLOduration=6.8384715830000005 podStartE2EDuration="6.838471583s" podCreationTimestamp="2025-10-13 18:31:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:58.829362696 +0000 UTC m=+1053.733728776" watchObservedRunningTime="2025-10-13 18:31:58.838471583 +0000 UTC m=+1053.742837663" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.877411 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6b66599996-gvfwf" podStartSLOduration=6.877392171 podStartE2EDuration="6.877392171s" podCreationTimestamp="2025-10-13 18:31:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:58.859939179 +0000 UTC m=+1053.764305259" watchObservedRunningTime="2025-10-13 18:31:58.877392171 +0000 UTC m=+1053.781758271" Oct 13 18:31:58 crc kubenswrapper[4974]: I1013 18:31:58.883680 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" podStartSLOduration=6.883645747 podStartE2EDuration="6.883645747s" podCreationTimestamp="2025-10-13 18:31:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:31:58.877916795 +0000 UTC m=+1053.782282875" watchObservedRunningTime="2025-10-13 18:31:58.883645747 +0000 UTC m=+1053.788011827" Oct 13 18:31:59 crc kubenswrapper[4974]: I1013 18:31:59.532040 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:31:59 crc kubenswrapper[4974]: I1013 18:31:59.532475 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="c6bd0d0c-5130-4398-91e9-e96652bcae59" containerName="watcher-api-log" containerID="cri-o://3991c5ce67933228abbdade41680d3fa186d322e2a0ef27f9e529aa4b15eb37f" gracePeriod=30 Oct 13 18:31:59 crc kubenswrapper[4974]: I1013 18:31:59.532762 4974 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="c6bd0d0c-5130-4398-91e9-e96652bcae59" containerName="watcher-api" containerID="cri-o://029f51b6c8020fdfde3aa8a4b4c24f4b5f134cb93da0a7b729b1880376f917b7" gracePeriod=30 Oct 13 18:31:59 crc kubenswrapper[4974]: I1013 18:31:59.825673 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" path="/var/lib/kubelet/pods/9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e/volumes" Oct 13 18:31:59 crc kubenswrapper[4974]: I1013 18:31:59.850072 4974 generic.go:334] "Generic (PLEG): container finished" podID="c6bd0d0c-5130-4398-91e9-e96652bcae59" containerID="3991c5ce67933228abbdade41680d3fa186d322e2a0ef27f9e529aa4b15eb37f" exitCode=143 Oct 13 18:31:59 crc kubenswrapper[4974]: I1013 18:31:59.850208 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c6bd0d0c-5130-4398-91e9-e96652bcae59","Type":"ContainerDied","Data":"3991c5ce67933228abbdade41680d3fa186d322e2a0ef27f9e529aa4b15eb37f"} Oct 13 18:32:00 crc kubenswrapper[4974]: I1013 18:32:00.889416 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" event={"ID":"45c18dbd-7083-463d-b845-f213bf6ae1ce","Type":"ContainerStarted","Data":"f12e71e1bf9cb987fc83811bf8f165d7e94a4524d9876c180666f6397f412c5f"} Oct 13 18:32:00 crc kubenswrapper[4974]: I1013 18:32:00.894269 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-cf595f5c8-dtfck" event={"ID":"b67ca997-2edf-492b-ab80-f618c7201a29","Type":"ContainerStarted","Data":"bc8adaa27a361eec107d79df9343ca3f33ec5a0803e9809b7fea3f9524859729"} Oct 13 18:32:00 crc kubenswrapper[4974]: I1013 18:32:00.894905 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:32:00 crc kubenswrapper[4974]: I1013 18:32:00.895317 4974 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:32:00 crc kubenswrapper[4974]: I1013 18:32:00.906236 4974 generic.go:334] "Generic (PLEG): container finished" podID="c6bd0d0c-5130-4398-91e9-e96652bcae59" containerID="029f51b6c8020fdfde3aa8a4b4c24f4b5f134cb93da0a7b729b1880376f917b7" exitCode=0 Oct 13 18:32:00 crc kubenswrapper[4974]: I1013 18:32:00.906322 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c6bd0d0c-5130-4398-91e9-e96652bcae59","Type":"ContainerDied","Data":"029f51b6c8020fdfde3aa8a4b4c24f4b5f134cb93da0a7b729b1880376f917b7"} Oct 13 18:32:00 crc kubenswrapper[4974]: I1013 18:32:00.936319 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-cf595f5c8-dtfck" podStartSLOduration=5.936304807 podStartE2EDuration="5.936304807s" podCreationTimestamp="2025-10-13 18:31:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:00.926674985 +0000 UTC m=+1055.831041055" watchObservedRunningTime="2025-10-13 18:32:00.936304807 +0000 UTC m=+1055.840670877" Oct 13 18:32:00 crc kubenswrapper[4974]: I1013 18:32:00.955158 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.112713 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.131886 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/751ed806-3991-4b72-91fb-ff5d56c66849-logs\") pod \"751ed806-3991-4b72-91fb-ff5d56c66849\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.132452 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/751ed806-3991-4b72-91fb-ff5d56c66849-logs" (OuterVolumeSpecName: "logs") pod "751ed806-3991-4b72-91fb-ff5d56c66849" (UID: "751ed806-3991-4b72-91fb-ff5d56c66849"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.132811 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-combined-ca-bundle\") pod \"751ed806-3991-4b72-91fb-ff5d56c66849\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.133527 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-config-data\") pod \"751ed806-3991-4b72-91fb-ff5d56c66849\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.135502 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pps8k\" (UniqueName: \"kubernetes.io/projected/751ed806-3991-4b72-91fb-ff5d56c66849-kube-api-access-pps8k\") pod \"751ed806-3991-4b72-91fb-ff5d56c66849\" (UID: \"751ed806-3991-4b72-91fb-ff5d56c66849\") " Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.138166 4974 reconciler_common.go:293] "Volume detached for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/751ed806-3991-4b72-91fb-ff5d56c66849-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.151905 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/751ed806-3991-4b72-91fb-ff5d56c66849-kube-api-access-pps8k" (OuterVolumeSpecName: "kube-api-access-pps8k") pod "751ed806-3991-4b72-91fb-ff5d56c66849" (UID: "751ed806-3991-4b72-91fb-ff5d56c66849"). InnerVolumeSpecName "kube-api-access-pps8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.239397 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-custom-prometheus-ca\") pod \"c6bd0d0c-5130-4398-91e9-e96652bcae59\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.239449 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6bd0d0c-5130-4398-91e9-e96652bcae59-logs\") pod \"c6bd0d0c-5130-4398-91e9-e96652bcae59\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.239789 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-combined-ca-bundle\") pod \"c6bd0d0c-5130-4398-91e9-e96652bcae59\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.239837 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8g6t\" (UniqueName: \"kubernetes.io/projected/c6bd0d0c-5130-4398-91e9-e96652bcae59-kube-api-access-n8g6t\") pod \"c6bd0d0c-5130-4398-91e9-e96652bcae59\" (UID: 
\"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.239863 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-config-data\") pod \"c6bd0d0c-5130-4398-91e9-e96652bcae59\" (UID: \"c6bd0d0c-5130-4398-91e9-e96652bcae59\") " Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.240399 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pps8k\" (UniqueName: \"kubernetes.io/projected/751ed806-3991-4b72-91fb-ff5d56c66849-kube-api-access-pps8k\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.248502 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6bd0d0c-5130-4398-91e9-e96652bcae59-logs" (OuterVolumeSpecName: "logs") pod "c6bd0d0c-5130-4398-91e9-e96652bcae59" (UID: "c6bd0d0c-5130-4398-91e9-e96652bcae59"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.248978 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "751ed806-3991-4b72-91fb-ff5d56c66849" (UID: "751ed806-3991-4b72-91fb-ff5d56c66849"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.259729 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bd0d0c-5130-4398-91e9-e96652bcae59-kube-api-access-n8g6t" (OuterVolumeSpecName: "kube-api-access-n8g6t") pod "c6bd0d0c-5130-4398-91e9-e96652bcae59" (UID: "c6bd0d0c-5130-4398-91e9-e96652bcae59"). InnerVolumeSpecName "kube-api-access-n8g6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.264072 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.341527 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c6bd0d0c-5130-4398-91e9-e96652bcae59" (UID: "c6bd0d0c-5130-4398-91e9-e96652bcae59"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.341573 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "c6bd0d0c-5130-4398-91e9-e96652bcae59" (UID: "c6bd0d0c-5130-4398-91e9-e96652bcae59"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.342497 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-config-data" (OuterVolumeSpecName: "config-data") pod "751ed806-3991-4b72-91fb-ff5d56c66849" (UID: "751ed806-3991-4b72-91fb-ff5d56c66849"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.342985 4974 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.343061 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c6bd0d0c-5130-4398-91e9-e96652bcae59-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.343117 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.343175 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/751ed806-3991-4b72-91fb-ff5d56c66849-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.343239 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.343308 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8g6t\" (UniqueName: \"kubernetes.io/projected/c6bd0d0c-5130-4398-91e9-e96652bcae59-kube-api-access-n8g6t\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.391908 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-config-data" (OuterVolumeSpecName: "config-data") pod "c6bd0d0c-5130-4398-91e9-e96652bcae59" (UID: 
"c6bd0d0c-5130-4398-91e9-e96652bcae59"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.444861 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c6bd0d0c-5130-4398-91e9-e96652bcae59-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.955773 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" event={"ID":"45c18dbd-7083-463d-b845-f213bf6ae1ce","Type":"ContainerStarted","Data":"587ea6bf93e51f3444582b6d0bd35bb1acf11c11786546ec26be1751a4e96ffe"} Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.959512 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"c6bd0d0c-5130-4398-91e9-e96652bcae59","Type":"ContainerDied","Data":"fb6f6877e16f249f11a5003065c97736a8a6a516dcd76fd0feaa71aa98b00f67"} Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.959548 4974 scope.go:117] "RemoveContainer" containerID="029f51b6c8020fdfde3aa8a4b4c24f4b5f134cb93da0a7b729b1880376f917b7" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.959630 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.964126 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b867446cf-7crxm" event={"ID":"5acb3840-d265-46e7-8a2b-630f1bf38ec5","Type":"ContainerStarted","Data":"ae4df010533b403cc5a373f79a63695fb218239789777959c5e726d9233229b5"} Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.964216 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5b867446cf-7crxm" event={"ID":"5acb3840-d265-46e7-8a2b-630f1bf38ec5","Type":"ContainerStarted","Data":"cc4533c0c71428764e602908fb575477c71b0e65f985c7f3e7a3341d4a0d8a81"} Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.966115 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"89790087-1d9c-4278-b62f-e18a94775048","Type":"ContainerStarted","Data":"94d1eefd3d192e105822b0db71e2378748ce57d54984aef971389806e05dbccf"} Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.966143 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"89790087-1d9c-4278-b62f-e18a94775048","Type":"ContainerStarted","Data":"3af3f161b32a6514d44b876c084d404f4da39e7ace812ecf8a18cd4aa4700289"} Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.969091 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.969101 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"751ed806-3991-4b72-91fb-ff5d56c66849","Type":"ContainerDied","Data":"d5353e925e50561d2ebc6a87d768334b873495b6fdd00da63a7fe267cc51d326"} Oct 13 18:32:01 crc kubenswrapper[4974]: I1013 18:32:01.980788 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6f69856764-9cjzr" podStartSLOduration=6.007105642 podStartE2EDuration="9.980770509s" podCreationTimestamp="2025-10-13 18:31:52 +0000 UTC" firstStartedPulling="2025-10-13 18:31:56.635931207 +0000 UTC m=+1051.540297287" lastFinishedPulling="2025-10-13 18:32:00.609596074 +0000 UTC m=+1055.513962154" observedRunningTime="2025-10-13 18:32:01.971961391 +0000 UTC m=+1056.876327471" watchObservedRunningTime="2025-10-13 18:32:01.980770509 +0000 UTC m=+1056.885136589" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.004341 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.025507 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.040771 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:32:02 crc kubenswrapper[4974]: E1013 18:32:02.041295 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bd0d0c-5130-4398-91e9-e96652bcae59" containerName="watcher-api-log" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.041320 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bd0d0c-5130-4398-91e9-e96652bcae59" containerName="watcher-api-log" Oct 13 18:32:02 crc kubenswrapper[4974]: E1013 18:32:02.041343 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c6bd0d0c-5130-4398-91e9-e96652bcae59" containerName="watcher-api" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.041352 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bd0d0c-5130-4398-91e9-e96652bcae59" containerName="watcher-api" Oct 13 18:32:02 crc kubenswrapper[4974]: E1013 18:32:02.041386 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751ed806-3991-4b72-91fb-ff5d56c66849" containerName="watcher-applier" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.041394 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="751ed806-3991-4b72-91fb-ff5d56c66849" containerName="watcher-applier" Oct 13 18:32:02 crc kubenswrapper[4974]: E1013 18:32:02.041420 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" containerName="watcher-decision-engine" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.041428 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fb7f805-1ab4-4f3d-b378-6ea404ac3d2e" containerName="watcher-decision-engine" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.041679 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6bd0d0c-5130-4398-91e9-e96652bcae59" containerName="watcher-api" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.041708 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6bd0d0c-5130-4398-91e9-e96652bcae59" containerName="watcher-api-log" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.041736 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="751ed806-3991-4b72-91fb-ff5d56c66849" containerName="watcher-applier" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.043049 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.045993 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.046263 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.046382 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.076631 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5b867446cf-7crxm" podStartSLOduration=6.118229165 podStartE2EDuration="10.076609251s" podCreationTimestamp="2025-10-13 18:31:52 +0000 UTC" firstStartedPulling="2025-10-13 18:31:56.632195242 +0000 UTC m=+1051.536561312" lastFinishedPulling="2025-10-13 18:32:00.590575318 +0000 UTC m=+1055.494941398" observedRunningTime="2025-10-13 18:32:02.019351207 +0000 UTC m=+1056.923717287" watchObservedRunningTime="2025-10-13 18:32:02.076609251 +0000 UTC m=+1056.980975361" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.116097 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.124103 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=4.12408584 podStartE2EDuration="4.12408584s" podCreationTimestamp="2025-10-13 18:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:02.046579524 +0000 UTC m=+1056.950945624" watchObservedRunningTime="2025-10-13 18:32:02.12408584 +0000 UTC m=+1057.028451920" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.159742 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.159788 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-config-data\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.159808 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b25h6\" (UniqueName: \"kubernetes.io/projected/5c387632-008a-4609-8b64-ff84c35596c7-kube-api-access-b25h6\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.159852 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.159904 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.159955 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/5c387632-008a-4609-8b64-ff84c35596c7-logs\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.159974 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.174725 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.188726 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"] Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.197719 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.198980 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.203707 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.216996 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.261960 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.262013 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-config-data\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.262035 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b25h6\" (UniqueName: \"kubernetes.io/projected/5c387632-008a-4609-8b64-ff84c35596c7-kube-api-access-b25h6\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.262070 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.262111 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.262145 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c387632-008a-4609-8b64-ff84c35596c7-logs\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.262162 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.265595 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c387632-008a-4609-8b64-ff84c35596c7-logs\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.268271 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-public-tls-certs\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.269318 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc 
kubenswrapper[4974]: I1013 18:32:02.279591 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.280265 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-config-data\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.281431 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b25h6\" (UniqueName: \"kubernetes.io/projected/5c387632-008a-4609-8b64-ff84c35596c7-kube-api-access-b25h6\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.300073 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c387632-008a-4609-8b64-ff84c35596c7-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"5c387632-008a-4609-8b64-ff84c35596c7\") " pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.362558 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.363284 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-logs\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.363312 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-config-data\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.363335 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrv7f\" (UniqueName: \"kubernetes.io/projected/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-kube-api-access-mrv7f\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.363427 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.465501 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-logs\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.465547 
4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-config-data\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.465572 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrv7f\" (UniqueName: \"kubernetes.io/projected/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-kube-api-access-mrv7f\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.465688 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.465981 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-logs\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.472203 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-config-data\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.497321 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-combined-ca-bundle\") pod 
\"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.501608 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrv7f\" (UniqueName: \"kubernetes.io/projected/dc2497ce-b7ed-481e-88c0-eb2e7aef34f9-kube-api-access-mrv7f\") pod \"watcher-applier-0\" (UID: \"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9\") " pod="openstack/watcher-applier-0" Oct 13 18:32:02 crc kubenswrapper[4974]: I1013 18:32:02.528699 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Oct 13 18:32:03 crc kubenswrapper[4974]: I1013 18:32:03.247843 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:32:03 crc kubenswrapper[4974]: I1013 18:32:03.321156 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549f94c95-vzgxw"] Oct 13 18:32:03 crc kubenswrapper[4974]: I1013 18:32:03.325979 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" podUID="1fec344f-2e1a-4913-a239-c891d311e830" containerName="dnsmasq-dns" containerID="cri-o://4a9bc4ff2f672bacc4a8f4f874949ccc353370d3bdef6a3474382ba992d8b89b" gracePeriod=10 Oct 13 18:32:03 crc kubenswrapper[4974]: I1013 18:32:03.824337 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="751ed806-3991-4b72-91fb-ff5d56c66849" path="/var/lib/kubelet/pods/751ed806-3991-4b72-91fb-ff5d56c66849/volumes" Oct 13 18:32:03 crc kubenswrapper[4974]: I1013 18:32:03.825008 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6bd0d0c-5130-4398-91e9-e96652bcae59" path="/var/lib/kubelet/pods/c6bd0d0c-5130-4398-91e9-e96652bcae59/volumes" Oct 13 18:32:04 crc kubenswrapper[4974]: I1013 18:32:04.002230 4974 generic.go:334] "Generic (PLEG): container finished" 
podID="1fec344f-2e1a-4913-a239-c891d311e830" containerID="4a9bc4ff2f672bacc4a8f4f874949ccc353370d3bdef6a3474382ba992d8b89b" exitCode=0 Oct 13 18:32:04 crc kubenswrapper[4974]: I1013 18:32:04.002274 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" event={"ID":"1fec344f-2e1a-4913-a239-c891d311e830","Type":"ContainerDied","Data":"4a9bc4ff2f672bacc4a8f4f874949ccc353370d3bdef6a3474382ba992d8b89b"} Oct 13 18:32:04 crc kubenswrapper[4974]: I1013 18:32:04.603128 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:32:04 crc kubenswrapper[4974]: I1013 18:32:04.609923 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:32:05 crc kubenswrapper[4974]: I1013 18:32:05.004147 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:32:05 crc kubenswrapper[4974]: I1013 18:32:05.017127 4974 generic.go:334] "Generic (PLEG): container finished" podID="89790087-1d9c-4278-b62f-e18a94775048" containerID="94d1eefd3d192e105822b0db71e2378748ce57d54984aef971389806e05dbccf" exitCode=1 Oct 13 18:32:05 crc kubenswrapper[4974]: I1013 18:32:05.017168 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"89790087-1d9c-4278-b62f-e18a94775048","Type":"ContainerDied","Data":"94d1eefd3d192e105822b0db71e2378748ce57d54984aef971389806e05dbccf"} Oct 13 18:32:05 crc kubenswrapper[4974]: I1013 18:32:05.017815 4974 scope.go:117] "RemoveContainer" containerID="94d1eefd3d192e105822b0db71e2378748ce57d54984aef971389806e05dbccf" Oct 13 18:32:05 crc kubenswrapper[4974]: I1013 18:32:05.072192 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:32:06 crc kubenswrapper[4974]: I1013 18:32:06.414421 4974 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:32:06 crc kubenswrapper[4974]: I1013 18:32:06.575019 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6dcbf4cfcd-l89jc" Oct 13 18:32:06 crc kubenswrapper[4974]: I1013 18:32:06.712452 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5796767b68-9dktc"] Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.034220 4974 generic.go:334] "Generic (PLEG): container finished" podID="bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" containerID="010bc063ed964ef76d8cfee518d76c71e443363c2c2521f1ecac16bb15de3e8a" exitCode=0 Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.034389 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5796767b68-9dktc" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon-log" containerID="cri-o://09cc0d62a371439bdad9579ae7fbafbb030801916febc2419d1d87c924749bdc" gracePeriod=30 Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.034454 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-njgsh" event={"ID":"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3","Type":"ContainerDied","Data":"010bc063ed964ef76d8cfee518d76c71e443363c2c2521f1ecac16bb15de3e8a"} Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.034851 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5796767b68-9dktc" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon" containerID="cri-o://c71f94d566debc81c03ff7c3180f22e17d2ee5689e9730d35d2824e22263bd4e" gracePeriod=30 Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.176495 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" podUID="1fec344f-2e1a-4913-a239-c891d311e830" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.169:5353: 
connect: connection refused" Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.742582 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.742674 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.742732 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.743550 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"171b1edc0ad6306edaf67441f6ae19fb0da0e0db23e98eea0abca2248299e8ae"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.743626 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://171b1edc0ad6306edaf67441f6ae19fb0da0e0db23e98eea0abca2248299e8ae" gracePeriod=600 Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.841761 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 
18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.941768 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-cf595f5c8-dtfck" Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.995443 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7d5467cfd-m6tpn"] Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.995614 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7d5467cfd-m6tpn" podUID="64946947-a861-4b5b-a016-102ced68c4b4" containerName="barbican-api-log" containerID="cri-o://769bd83937464a78d16b6ccd0c10f94e0f1cbd354f2fbab5a4a538773fc255ac" gracePeriod=30 Oct 13 18:32:07 crc kubenswrapper[4974]: I1013 18:32:07.996216 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7d5467cfd-m6tpn" podUID="64946947-a861-4b5b-a016-102ced68c4b4" containerName="barbican-api" containerID="cri-o://8d558a53da031efb37d69bc361855430a83cfb892a4e39edc9706487e0c2624c" gracePeriod=30 Oct 13 18:32:08 crc kubenswrapper[4974]: I1013 18:32:08.046214 4974 generic.go:334] "Generic (PLEG): container finished" podID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerID="c71f94d566debc81c03ff7c3180f22e17d2ee5689e9730d35d2824e22263bd4e" exitCode=0 Oct 13 18:32:08 crc kubenswrapper[4974]: I1013 18:32:08.046305 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5796767b68-9dktc" event={"ID":"a9b88447-1cdb-4666-a4c2-31b7a0e7192f","Type":"ContainerDied","Data":"c71f94d566debc81c03ff7c3180f22e17d2ee5689e9730d35d2824e22263bd4e"} Oct 13 18:32:08 crc kubenswrapper[4974]: I1013 18:32:08.048370 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="171b1edc0ad6306edaf67441f6ae19fb0da0e0db23e98eea0abca2248299e8ae" exitCode=0 Oct 13 18:32:08 crc kubenswrapper[4974]: I1013 18:32:08.048430 4974 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"171b1edc0ad6306edaf67441f6ae19fb0da0e0db23e98eea0abca2248299e8ae"} Oct 13 18:32:08 crc kubenswrapper[4974]: I1013 18:32:08.417455 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:32:08 crc kubenswrapper[4974]: I1013 18:32:08.418229 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:32:08 crc kubenswrapper[4974]: I1013 18:32:08.885626 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d5467cfd-m6tpn" podUID="64946947-a861-4b5b-a016-102ced68c4b4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.177:9311/healthcheck\": read tcp 10.217.0.2:56334->10.217.0.177:9311: read: connection reset by peer" Oct 13 18:32:08 crc kubenswrapper[4974]: I1013 18:32:08.885947 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7d5467cfd-m6tpn" podUID="64946947-a861-4b5b-a016-102ced68c4b4" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.177:9311/healthcheck\": read tcp 10.217.0.2:56326->10.217.0.177:9311: read: connection reset by peer" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.065600 4974 generic.go:334] "Generic (PLEG): container finished" podID="ab455ee5-f8bd-4d4e-b179-524fa3edcc52" containerID="ff5dec3b18644db01f42beb495cb595475713647a9fa6ae84f9caa9404dd6b33" exitCode=0 Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.065678 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jgthj" event={"ID":"ab455ee5-f8bd-4d4e-b179-524fa3edcc52","Type":"ContainerDied","Data":"ff5dec3b18644db01f42beb495cb595475713647a9fa6ae84f9caa9404dd6b33"} Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.074718 
4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-njgsh" event={"ID":"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3","Type":"ContainerDied","Data":"870a44faa56d8f5c1e61769f7df75b65be2735c3166c19127fada16ceebc1c05"} Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.074763 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="870a44faa56d8f5c1e61769f7df75b65be2735c3166c19127fada16ceebc1c05" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.076954 4974 generic.go:334] "Generic (PLEG): container finished" podID="64946947-a861-4b5b-a016-102ced68c4b4" containerID="8d558a53da031efb37d69bc361855430a83cfb892a4e39edc9706487e0c2624c" exitCode=0 Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.076981 4974 generic.go:334] "Generic (PLEG): container finished" podID="64946947-a861-4b5b-a016-102ced68c4b4" containerID="769bd83937464a78d16b6ccd0c10f94e0f1cbd354f2fbab5a4a538773fc255ac" exitCode=143 Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.077001 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5467cfd-m6tpn" event={"ID":"64946947-a861-4b5b-a016-102ced68c4b4","Type":"ContainerDied","Data":"8d558a53da031efb37d69bc361855430a83cfb892a4e39edc9706487e0c2624c"} Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.077022 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5467cfd-m6tpn" event={"ID":"64946947-a861-4b5b-a016-102ced68c4b4","Type":"ContainerDied","Data":"769bd83937464a78d16b6ccd0c10f94e0f1cbd354f2fbab5a4a538773fc255ac"} Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.173520 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-njgsh" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.243973 4974 scope.go:117] "RemoveContainer" containerID="3991c5ce67933228abbdade41680d3fa186d322e2a0ef27f9e529aa4b15eb37f" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.326337 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-scripts\") pod \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.326881 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-db-sync-config-data\") pod \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.326929 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-config-data\") pod \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.326962 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-etc-machine-id\") pod \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.327004 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbb85\" (UniqueName: \"kubernetes.io/projected/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-kube-api-access-zbb85\") pod \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\" (UID: 
\"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.327085 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-combined-ca-bundle\") pod \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\" (UID: \"bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.327497 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" (UID: "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.330520 4974 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.333844 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-scripts" (OuterVolumeSpecName: "scripts") pod "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" (UID: "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.334822 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-kube-api-access-zbb85" (OuterVolumeSpecName: "kube-api-access-zbb85") pod "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" (UID: "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3"). InnerVolumeSpecName "kube-api-access-zbb85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.334884 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" (UID: "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.357472 4974 scope.go:117] "RemoveContainer" containerID="9f3538a3336ad09da6e2d412fce7099b1f11061277d16c5d59fe29b11072dbbc" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.382542 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" (UID: "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.432208 4974 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.432401 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbb85\" (UniqueName: \"kubernetes.io/projected/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-kube-api-access-zbb85\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.432464 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.432516 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.439106 4974 scope.go:117] "RemoveContainer" containerID="436597ac77fb62acd2a6755e030d52c331e47424c1685e9a911b5a1473f796ca" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.472019 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-config-data" (OuterVolumeSpecName: "config-data") pod "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" (UID: "bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.492882 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.544923 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-swift-storage-0\") pod \"1fec344f-2e1a-4913-a239-c891d311e830\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.545343 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-svc\") pod \"1fec344f-2e1a-4913-a239-c891d311e830\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.545443 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-sb\") pod \"1fec344f-2e1a-4913-a239-c891d311e830\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.545556 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-config\") pod \"1fec344f-2e1a-4913-a239-c891d311e830\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.545634 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-nb\") pod \"1fec344f-2e1a-4913-a239-c891d311e830\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.545730 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4cnf\" 
(UniqueName: \"kubernetes.io/projected/1fec344f-2e1a-4913-a239-c891d311e830-kube-api-access-n4cnf\") pod \"1fec344f-2e1a-4913-a239-c891d311e830\" (UID: \"1fec344f-2e1a-4913-a239-c891d311e830\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.546179 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.581138 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fec344f-2e1a-4913-a239-c891d311e830-kube-api-access-n4cnf" (OuterVolumeSpecName: "kube-api-access-n4cnf") pod "1fec344f-2e1a-4913-a239-c891d311e830" (UID: "1fec344f-2e1a-4913-a239-c891d311e830"). InnerVolumeSpecName "kube-api-access-n4cnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.650209 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4cnf\" (UniqueName: \"kubernetes.io/projected/1fec344f-2e1a-4913-a239-c891d311e830-kube-api-access-n4cnf\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.656735 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1fec344f-2e1a-4913-a239-c891d311e830" (UID: "1fec344f-2e1a-4913-a239-c891d311e830"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.721839 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1fec344f-2e1a-4913-a239-c891d311e830" (UID: "1fec344f-2e1a-4913-a239-c891d311e830"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.755399 4974 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.755432 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.809279 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1fec344f-2e1a-4913-a239-c891d311e830" (UID: "1fec344f-2e1a-4913-a239-c891d311e830"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.810801 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-config" (OuterVolumeSpecName: "config") pod "1fec344f-2e1a-4913-a239-c891d311e830" (UID: "1fec344f-2e1a-4913-a239-c891d311e830"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.864373 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.864402 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.868247 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.882549 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.883532 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1fec344f-2e1a-4913-a239-c891d311e830" (UID: "1fec344f-2e1a-4913-a239-c891d311e830"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: E1013 18:32:09.934919 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"ceilometer-notification-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack/ceilometer-0" podUID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.968282 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64946947-a861-4b5b-a016-102ced68c4b4-logs\") pod \"64946947-a861-4b5b-a016-102ced68c4b4\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.968401 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data-custom\") pod \"64946947-a861-4b5b-a016-102ced68c4b4\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.968519 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-combined-ca-bundle\") pod \"64946947-a861-4b5b-a016-102ced68c4b4\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.968573 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5srs\" (UniqueName: \"kubernetes.io/projected/64946947-a861-4b5b-a016-102ced68c4b4-kube-api-access-m5srs\") pod \"64946947-a861-4b5b-a016-102ced68c4b4\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " Oct 13 
18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.968667 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data\") pod \"64946947-a861-4b5b-a016-102ced68c4b4\" (UID: \"64946947-a861-4b5b-a016-102ced68c4b4\") " Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.968997 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64946947-a861-4b5b-a016-102ced68c4b4-logs" (OuterVolumeSpecName: "logs") pod "64946947-a861-4b5b-a016-102ced68c4b4" (UID: "64946947-a861-4b5b-a016-102ced68c4b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.969380 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64946947-a861-4b5b-a016-102ced68c4b4-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.969408 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1fec344f-2e1a-4913-a239-c891d311e830-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.972274 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64946947-a861-4b5b-a016-102ced68c4b4-kube-api-access-m5srs" (OuterVolumeSpecName: "kube-api-access-m5srs") pod "64946947-a861-4b5b-a016-102ced68c4b4" (UID: "64946947-a861-4b5b-a016-102ced68c4b4"). InnerVolumeSpecName "kube-api-access-m5srs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.973276 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "64946947-a861-4b5b-a016-102ced68c4b4" (UID: "64946947-a861-4b5b-a016-102ced68c4b4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:09 crc kubenswrapper[4974]: I1013 18:32:09.997255 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64946947-a861-4b5b-a016-102ced68c4b4" (UID: "64946947-a861-4b5b-a016-102ced68c4b4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.034386 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data" (OuterVolumeSpecName: "config-data") pod "64946947-a861-4b5b-a016-102ced68c4b4" (UID: "64946947-a861-4b5b-a016-102ced68c4b4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.070849 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5srs\" (UniqueName: \"kubernetes.io/projected/64946947-a861-4b5b-a016-102ced68c4b4-kube-api-access-m5srs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.070891 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.070904 4974 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.070915 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64946947-a861-4b5b-a016-102ced68c4b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.090300 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"300c099d-eca4-4f0c-a79f-dde4dddd8a98","Type":"ContainerStarted","Data":"9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887"} Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.090721 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.090888 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" containerName="sg-core" containerID="cri-o://817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7" gracePeriod=30 Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 
18:32:10.091138 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" containerName="proxy-httpd" containerID="cri-o://9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887" gracePeriod=30 Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.096311 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9","Type":"ContainerStarted","Data":"2dc2efe075269b9d3785334d6f93e43edef96348f963eb80fa91526060ca4fee"} Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.096357 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"dc2497ce-b7ed-481e-88c0-eb2e7aef34f9","Type":"ContainerStarted","Data":"0b77fd3557fd7d0db594d22da7d4cca7cc78cd1e7544cdd01b916b9f54877964"} Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.100127 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.100124 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-549f94c95-vzgxw" event={"ID":"1fec344f-2e1a-4913-a239-c891d311e830","Type":"ContainerDied","Data":"cee2e569c0f2e92c7f1667bb862547dd18338bd1680d93cfb5d276300207cf04"} Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.100293 4974 scope.go:117] "RemoveContainer" containerID="4a9bc4ff2f672bacc4a8f4f874949ccc353370d3bdef6a3474382ba992d8b89b" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.108904 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"89790087-1d9c-4278-b62f-e18a94775048","Type":"ContainerStarted","Data":"e4ce9d5b57598d4e4ce005dd5f090268be817265c2f0db6be413ed342cbf8e56"} Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.118465 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.124827 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7d5467cfd-m6tpn" event={"ID":"64946947-a861-4b5b-a016-102ced68c4b4","Type":"ContainerDied","Data":"ed6372e23a7facf06ddebe859f7f0ebe70fb9505e4ed3e0a9b537c12bfdf1294"} Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.124932 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7d5467cfd-m6tpn" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.131730 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"5a8943cc34f896ee3b60b9837ff3add67567a5a52b9b5a1adbb600a9ed07e274"} Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.131755 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-njgsh" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.140271 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=8.140251928 podStartE2EDuration="8.140251928s" podCreationTimestamp="2025-10-13 18:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:10.137309005 +0000 UTC m=+1065.041675095" watchObservedRunningTime="2025-10-13 18:32:10.140251928 +0000 UTC m=+1065.044618008" Oct 13 18:32:10 crc kubenswrapper[4974]: W1013 18:32:10.141191 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5c387632_008a_4609_8b64_ff84c35596c7.slice/crio-bff4936fab641ce4a41e52dfd14868f180cbaed3c47968ef7950e26677055630 WatchSource:0}: Error finding container bff4936fab641ce4a41e52dfd14868f180cbaed3c47968ef7950e26677055630: Status 404 returned error can't find the container with id bff4936fab641ce4a41e52dfd14868f180cbaed3c47968ef7950e26677055630 Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.310872 4974 scope.go:117] "RemoveContainer" containerID="e3dab4edeab41c1ba53393c31f9813138fa97182384ef3583e979f98e808399d" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.391738 4974 scope.go:117] "RemoveContainer" containerID="8d558a53da031efb37d69bc361855430a83cfb892a4e39edc9706487e0c2624c" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.449870 4974 scope.go:117] "RemoveContainer" containerID="769bd83937464a78d16b6ccd0c10f94e0f1cbd354f2fbab5a4a538773fc255ac" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.459729 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-549f94c95-vzgxw"] Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.501998 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-549f94c95-vzgxw"] Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.532735 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 18:32:10 crc kubenswrapper[4974]: E1013 18:32:10.533159 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" containerName="cinder-db-sync" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.533174 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" containerName="cinder-db-sync" Oct 13 18:32:10 crc kubenswrapper[4974]: E1013 18:32:10.533187 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fec344f-2e1a-4913-a239-c891d311e830" containerName="init" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.533195 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fec344f-2e1a-4913-a239-c891d311e830" containerName="init" Oct 13 18:32:10 crc kubenswrapper[4974]: E1013 18:32:10.533232 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64946947-a861-4b5b-a016-102ced68c4b4" containerName="barbican-api" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.533239 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="64946947-a861-4b5b-a016-102ced68c4b4" containerName="barbican-api" Oct 13 18:32:10 crc kubenswrapper[4974]: E1013 18:32:10.533258 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64946947-a861-4b5b-a016-102ced68c4b4" containerName="barbican-api-log" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.533267 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="64946947-a861-4b5b-a016-102ced68c4b4" containerName="barbican-api-log" Oct 13 18:32:10 crc kubenswrapper[4974]: E1013 18:32:10.533279 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fec344f-2e1a-4913-a239-c891d311e830" containerName="dnsmasq-dns" Oct 13 18:32:10 crc kubenswrapper[4974]: 
I1013 18:32:10.533289 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fec344f-2e1a-4913-a239-c891d311e830" containerName="dnsmasq-dns" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.533502 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="64946947-a861-4b5b-a016-102ced68c4b4" containerName="barbican-api-log" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.533531 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="64946947-a861-4b5b-a016-102ced68c4b4" containerName="barbican-api" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.533543 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fec344f-2e1a-4913-a239-c891d311e830" containerName="dnsmasq-dns" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.533560 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" containerName="cinder-db-sync" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.541053 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.551515 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9s2zv" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.551586 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.551700 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.563844 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.580892 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7d5467cfd-m6tpn"] Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.594542 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.594632 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mvwl\" (UniqueName: \"kubernetes.io/projected/687bd2ca-d010-478f-902c-76dfb378ec55-kube-api-access-4mvwl\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.594687 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.594850 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/687bd2ca-d010-478f-902c-76dfb378ec55-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.594900 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.594940 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.598071 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7d5467cfd-m6tpn"] Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.615626 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-899fdf8d7-pw7nj"] Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.617717 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.624491 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.635844 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-899fdf8d7-pw7nj"] Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.701343 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/687bd2ca-d010-478f-902c-76dfb378ec55-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.703945 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/687bd2ca-d010-478f-902c-76dfb378ec55-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.704374 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.704419 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.704712 4974 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.704818 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mvwl\" (UniqueName: \"kubernetes.io/projected/687bd2ca-d010-478f-902c-76dfb378ec55-kube-api-access-4mvwl\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.716124 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-scripts\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.718304 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.719317 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-scripts\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.730550 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.733281 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.736759 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mvwl\" (UniqueName: \"kubernetes.io/projected/687bd2ca-d010-478f-902c-76dfb378ec55-kube-api-access-4mvwl\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.737129 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.738266 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data\") pod \"cinder-scheduler-0\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.738572 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.756788 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.792359 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-jgthj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.817844 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-config\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.817922 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-nb\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.817992 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-sb\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.819938 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fxml\" (UniqueName: \"kubernetes.io/projected/34f9a6fc-6ac6-4269-b227-5daa5939a207-kube-api-access-4fxml\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.819990 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-swift-storage-0\") pod 
\"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.820044 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-svc\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.897912 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.907260 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.921114 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtwx7\" (UniqueName: \"kubernetes.io/projected/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-kube-api-access-gtwx7\") pod \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.921497 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-db-sync-config-data\") pod \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.921623 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-config-data\") pod \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 
18:32:10.921826 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-combined-ca-bundle\") pod \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\" (UID: \"ab455ee5-f8bd-4d4e-b179-524fa3edcc52\") " Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.922211 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-config\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.922302 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsl68\" (UniqueName: \"kubernetes.io/projected/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-kube-api-access-wsl68\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.922393 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.922541 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data-custom\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.922622 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-nb\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.922754 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-scripts\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.922853 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-logs\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.922933 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-sb\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.923039 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.923160 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-config\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: 
\"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.923246 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fxml\" (UniqueName: \"kubernetes.io/projected/34f9a6fc-6ac6-4269-b227-5daa5939a207-kube-api-access-4fxml\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.923391 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-swift-storage-0\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.923482 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-svc\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.923549 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.925616 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-swift-storage-0\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " 
pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.925824 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-nb\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.926449 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-svc\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.928727 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ab455ee5-f8bd-4d4e-b179-524fa3edcc52" (UID: "ab455ee5-f8bd-4d4e-b179-524fa3edcc52"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.928924 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-kube-api-access-gtwx7" (OuterVolumeSpecName: "kube-api-access-gtwx7") pod "ab455ee5-f8bd-4d4e-b179-524fa3edcc52" (UID: "ab455ee5-f8bd-4d4e-b179-524fa3edcc52"). InnerVolumeSpecName "kube-api-access-gtwx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.929035 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-sb\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.940720 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fxml\" (UniqueName: \"kubernetes.io/projected/34f9a6fc-6ac6-4269-b227-5daa5939a207-kube-api-access-4fxml\") pod \"dnsmasq-dns-899fdf8d7-pw7nj\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.972357 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:10 crc kubenswrapper[4974]: I1013 18:32:10.983476 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab455ee5-f8bd-4d4e-b179-524fa3edcc52" (UID: "ab455ee5-f8bd-4d4e-b179-524fa3edcc52"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.025487 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-config-data\") pod \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.025520 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-scripts\") pod \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.025578 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-combined-ca-bundle\") pod \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.025601 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-sg-core-conf-yaml\") pod \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.025709 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-log-httpd\") pod \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.025755 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-run-httpd\") pod \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.025803 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxrvw\" (UniqueName: \"kubernetes.io/projected/300c099d-eca4-4f0c-a79f-dde4dddd8a98-kube-api-access-sxrvw\") pod \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\" (UID: \"300c099d-eca4-4f0c-a79f-dde4dddd8a98\") " Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.025987 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.026032 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.026058 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsl68\" (UniqueName: \"kubernetes.io/projected/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-kube-api-access-wsl68\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.026083 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc 
kubenswrapper[4974]: I1013 18:32:11.026112 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data-custom\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.026158 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-scripts\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.026184 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-logs\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.026252 4974 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.026264 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.026274 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtwx7\" (UniqueName: \"kubernetes.io/projected/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-kube-api-access-gtwx7\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.027876 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-logs\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.028456 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-etc-machine-id\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.028779 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "300c099d-eca4-4f0c-a79f-dde4dddd8a98" (UID: "300c099d-eca4-4f0c-a79f-dde4dddd8a98"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.029205 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "300c099d-eca4-4f0c-a79f-dde4dddd8a98" (UID: "300c099d-eca4-4f0c-a79f-dde4dddd8a98"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.031593 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.031898 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-config-data" (OuterVolumeSpecName: "config-data") pod "ab455ee5-f8bd-4d4e-b179-524fa3edcc52" (UID: "ab455ee5-f8bd-4d4e-b179-524fa3edcc52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.034444 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-scripts" (OuterVolumeSpecName: "scripts") pod "300c099d-eca4-4f0c-a79f-dde4dddd8a98" (UID: "300c099d-eca4-4f0c-a79f-dde4dddd8a98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.050224 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/300c099d-eca4-4f0c-a79f-dde4dddd8a98-kube-api-access-sxrvw" (OuterVolumeSpecName: "kube-api-access-sxrvw") pod "300c099d-eca4-4f0c-a79f-dde4dddd8a98" (UID: "300c099d-eca4-4f0c-a79f-dde4dddd8a98"). InnerVolumeSpecName "kube-api-access-sxrvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.051735 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.051879 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-scripts\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.053582 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data-custom\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.060808 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsl68\" (UniqueName: \"kubernetes.io/projected/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-kube-api-access-wsl68\") pod \"cinder-api-0\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.078246 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.091147 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "300c099d-eca4-4f0c-a79f-dde4dddd8a98" (UID: "300c099d-eca4-4f0c-a79f-dde4dddd8a98"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.128351 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab455ee5-f8bd-4d4e-b179-524fa3edcc52-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.128588 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxrvw\" (UniqueName: \"kubernetes.io/projected/300c099d-eca4-4f0c-a79f-dde4dddd8a98-kube-api-access-sxrvw\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.128600 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.128610 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.128619 4974 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.128626 4974 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/300c099d-eca4-4f0c-a79f-dde4dddd8a98-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.143161 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "300c099d-eca4-4f0c-a79f-dde4dddd8a98" (UID: 
"300c099d-eca4-4f0c-a79f-dde4dddd8a98"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.173919 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-jgthj" event={"ID":"ab455ee5-f8bd-4d4e-b179-524fa3edcc52","Type":"ContainerDied","Data":"87be49189833f50e15e2f4ef25769f6d16a9f57a9dca906d77c8edb3cfa15611"} Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.173952 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87be49189833f50e15e2f4ef25769f6d16a9f57a9dca906d77c8edb3cfa15611" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.174025 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-jgthj" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.176936 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5c387632-008a-4609-8b64-ff84c35596c7","Type":"ContainerStarted","Data":"ed1cba1a0c46a67924547f0441199b0d1e457b523a0cf4526d5f63d676822daf"} Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.176972 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5c387632-008a-4609-8b64-ff84c35596c7","Type":"ContainerStarted","Data":"d58f0f9d07db4ef0bc9c965bbe8af1d97fa8784b1da5dc1926fd66483114c341"} Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.176982 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"5c387632-008a-4609-8b64-ff84c35596c7","Type":"ContainerStarted","Data":"bff4936fab641ce4a41e52dfd14868f180cbaed3c47968ef7950e26677055630"} Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.179240 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.181884 4974 generic.go:334] "Generic 
(PLEG): container finished" podID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" containerID="9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887" exitCode=0 Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.181921 4974 generic.go:334] "Generic (PLEG): container finished" podID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" containerID="817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7" exitCode=2 Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.182328 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"300c099d-eca4-4f0c-a79f-dde4dddd8a98","Type":"ContainerDied","Data":"9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887"} Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.182356 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"300c099d-eca4-4f0c-a79f-dde4dddd8a98","Type":"ContainerDied","Data":"817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7"} Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.182365 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"300c099d-eca4-4f0c-a79f-dde4dddd8a98","Type":"ContainerDied","Data":"e7ae00adc6008de72d279c24b5e188e9839442bb1950329bd992e04f65d16807"} Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.182381 4974 scope.go:117] "RemoveContainer" containerID="9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.182502 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.189219 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="5c387632-008a-4609-8b64-ff84c35596c7" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.180:9322/\": dial tcp 10.217.0.180:9322: connect: connection refused" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.237805 4974 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.238756 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=10.238744763 podStartE2EDuration="10.238744763s" podCreationTimestamp="2025-10-13 18:32:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:11.207668347 +0000 UTC m=+1066.112034427" watchObservedRunningTime="2025-10-13 18:32:11.238744763 +0000 UTC m=+1066.143110843" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.251739 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-config-data" (OuterVolumeSpecName: "config-data") pod "300c099d-eca4-4f0c-a79f-dde4dddd8a98" (UID: "300c099d-eca4-4f0c-a79f-dde4dddd8a98"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.282993 4974 scope.go:117] "RemoveContainer" containerID="817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.353319 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/300c099d-eca4-4f0c-a79f-dde4dddd8a98-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.381549 4974 scope.go:117] "RemoveContainer" containerID="9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887" Oct 13 18:32:11 crc kubenswrapper[4974]: E1013 18:32:11.388855 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887\": container with ID starting with 9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887 not found: ID does not exist" containerID="9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.388911 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887"} err="failed to get container status \"9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887\": rpc error: code = NotFound desc = could not find container \"9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887\": container with ID starting with 9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887 not found: ID does not exist" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.388937 4974 scope.go:117] "RemoveContainer" containerID="817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7" Oct 13 18:32:11 crc kubenswrapper[4974]: E1013 18:32:11.390747 4974 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7\": container with ID starting with 817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7 not found: ID does not exist" containerID="817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.390786 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7"} err="failed to get container status \"817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7\": rpc error: code = NotFound desc = could not find container \"817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7\": container with ID starting with 817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7 not found: ID does not exist" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.390811 4974 scope.go:117] "RemoveContainer" containerID="9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.393052 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887"} err="failed to get container status \"9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887\": rpc error: code = NotFound desc = could not find container \"9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887\": container with ID starting with 9ba0a5e40b77765d24f9973a4964a90cb2bd1b794707aa80accd0a07fdef2887 not found: ID does not exist" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.393080 4974 scope.go:117] "RemoveContainer" containerID="817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.393979 4974 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7"} err="failed to get container status \"817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7\": rpc error: code = NotFound desc = could not find container \"817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7\": container with ID starting with 817a721d9524b661c64e3eaf34a0472424ff976bdd47f803c9aeaf0a184a3db7 not found: ID does not exist" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.475501 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.613842 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.661640 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.677629 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:11 crc kubenswrapper[4974]: E1013 18:32:11.678102 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" containerName="sg-core" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.678115 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" containerName="sg-core" Oct 13 18:32:11 crc kubenswrapper[4974]: E1013 18:32:11.678135 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab455ee5-f8bd-4d4e-b179-524fa3edcc52" containerName="glance-db-sync" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.678141 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab455ee5-f8bd-4d4e-b179-524fa3edcc52" containerName="glance-db-sync" Oct 13 18:32:11 crc kubenswrapper[4974]: E1013 18:32:11.678152 4974 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" containerName="proxy-httpd" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.678158 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" containerName="proxy-httpd" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.678326 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5796767b68-9dktc" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.678343 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" containerName="sg-core" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.678408 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" containerName="proxy-httpd" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.678444 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab455ee5-f8bd-4d4e-b179-524fa3edcc52" containerName="glance-db-sync" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.680220 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.683357 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.685543 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.686911 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-899fdf8d7-pw7nj"] Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.711544 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.724738 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-dd5946f99-q7kth"] Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.727635 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.737013 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dd5946f99-q7kth"] Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.759004 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-899fdf8d7-pw7nj"] Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763308 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-svc\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763362 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-run-httpd\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763423 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-log-httpd\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763469 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9pnt\" (UniqueName: \"kubernetes.io/projected/e27cf425-7780-4d59-b887-ec5da0a83a71-kube-api-access-p9pnt\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763500 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md9dr\" (UniqueName: \"kubernetes.io/projected/33807b55-8fc9-44dc-9639-b633bd748101-kube-api-access-md9dr\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763517 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-config\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763553 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-sb\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763573 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-swift-storage-0\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763629 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763774 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-scripts\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763821 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763859 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-config-data\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.763889 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-nb\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.864906 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-svc\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.864950 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-run-httpd\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.864997 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-log-httpd\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865022 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9pnt\" (UniqueName: \"kubernetes.io/projected/e27cf425-7780-4d59-b887-ec5da0a83a71-kube-api-access-p9pnt\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " 
pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865049 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md9dr\" (UniqueName: \"kubernetes.io/projected/33807b55-8fc9-44dc-9639-b633bd748101-kube-api-access-md9dr\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865064 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-config\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865078 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-sb\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865097 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-swift-storage-0\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865145 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 
18:32:11.865182 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-scripts\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865231 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865251 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-config-data\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865280 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-nb\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865752 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-svc\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.865974 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-nb\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.866294 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-swift-storage-0\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.866588 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-run-httpd\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.866830 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-log-httpd\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.868756 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-config\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.875020 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 
18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.875259 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-config-data\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.877320 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.880000 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-sb\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.880761 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-scripts\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.881730 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fec344f-2e1a-4913-a239-c891d311e830" path="/var/lib/kubelet/pods/1fec344f-2e1a-4913-a239-c891d311e830/volumes" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.882500 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="300c099d-eca4-4f0c-a79f-dde4dddd8a98" path="/var/lib/kubelet/pods/300c099d-eca4-4f0c-a79f-dde4dddd8a98/volumes" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.883231 4974 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="64946947-a861-4b5b-a016-102ced68c4b4" path="/var/lib/kubelet/pods/64946947-a861-4b5b-a016-102ced68c4b4/volumes" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.909133 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md9dr\" (UniqueName: \"kubernetes.io/projected/33807b55-8fc9-44dc-9639-b633bd748101-kube-api-access-md9dr\") pod \"dnsmasq-dns-dd5946f99-q7kth\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.914511 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9pnt\" (UniqueName: \"kubernetes.io/projected/e27cf425-7780-4d59-b887-ec5da0a83a71-kube-api-access-p9pnt\") pod \"ceilometer-0\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " pod="openstack/ceilometer-0" Oct 13 18:32:11 crc kubenswrapper[4974]: I1013 18:32:11.924624 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.015438 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.056147 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.255088 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" event={"ID":"34f9a6fc-6ac6-4269-b227-5daa5939a207","Type":"ContainerStarted","Data":"cfb346f6bce5283cd7c2bb371c5b2d97bc7fb4f9a5e5397cfee1feb0af648637"} Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.259686 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de","Type":"ContainerStarted","Data":"ea3598df805e354e2919381889edd9c87b11d19feffc3c1c7aad1c6b86a098f7"} Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.318073 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"687bd2ca-d010-478f-902c-76dfb378ec55","Type":"ContainerStarted","Data":"617f5958a0ac4e47139e383188014a02b215c8c17f45b8c01441f17ed8fb3991"} Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.373907 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.374333 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.423954 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.535270 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.538982 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.540090 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.540128 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.550104 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.550378 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nt8rf" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.550746 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.621317 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.660794 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.776251 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-scripts\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.776311 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.776333 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.776393 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.776422 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-logs\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.776450 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxxsc\" (UniqueName: \"kubernetes.io/projected/807d94ac-9bfc-465c-b60e-bb314831cdb1-kube-api-access-lxxsc\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.776471 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-combined-ca-bundle\") pod 
\"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.878084 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.878142 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-logs\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.878179 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxxsc\" (UniqueName: \"kubernetes.io/projected/807d94ac-9bfc-465c-b60e-bb314831cdb1-kube-api-access-lxxsc\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.878200 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.878261 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-scripts\") pod \"glance-default-external-api-0\" (UID: 
\"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.878299 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-config-data\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.878315 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.878607 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.883385 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.883612 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-logs\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 
crc kubenswrapper[4974]: I1013 18:32:12.883667 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.901485 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.901769 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-config-data\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.902381 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.907800 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxxsc\" (UniqueName: \"kubernetes.io/projected/807d94ac-9bfc-465c-b60e-bb314831cdb1-kube-api-access-lxxsc\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.911953 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-scripts\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.912068 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.932272 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:12 crc kubenswrapper[4974]: I1013 18:32:12.978630 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.005001 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.081716 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p649n\" (UniqueName: \"kubernetes.io/projected/ff5909ff-629e-4b68-bac5-2627113a0809-kube-api-access-p649n\") pod 
\"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.081767 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.081804 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.081846 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.081885 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.081945 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.081963 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.184001 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p649n\" (UniqueName: \"kubernetes.io/projected/ff5909ff-629e-4b68-bac5-2627113a0809-kube-api-access-p649n\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.184047 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.186621 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.186755 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.186935 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.188497 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.188527 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.188743 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.198252 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 
18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.198482 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-logs\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.234199 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-config-data\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.242393 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.253243 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-scripts\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.253321 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.275072 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p649n\" (UniqueName: \"kubernetes.io/projected/ff5909ff-629e-4b68-bac5-2627113a0809-kube-api-access-p649n\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.280630 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dd5946f99-q7kth"] Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.297589 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: W1013 18:32:13.299264 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33807b55_8fc9_44dc_9639_b633bd748101.slice/crio-47e5f8cb902205fb77a7f4c4c551bb7a6df12a6648997cd78370f29aaa374cbd WatchSource:0}: Error finding container 47e5f8cb902205fb77a7f4c4c551bb7a6df12a6648997cd78370f29aaa374cbd: Status 404 returned error can't find the container with id 47e5f8cb902205fb77a7f4c4c551bb7a6df12a6648997cd78370f29aaa374cbd Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.375899 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e27cf425-7780-4d59-b887-ec5da0a83a71","Type":"ContainerStarted","Data":"df6851fa5f3e7c72b166f22308f279cef82b2096c77045bd2867b3e13b62608f"} Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.382757 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/watcher-api-0" 
podUID="5c387632-008a-4609-8b64-ff84c35596c7" containerName="watcher-api-log" probeResult="failure" output="Get \"https://10.217.0.180:9322/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.384608 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" event={"ID":"33807b55-8fc9-44dc-9639-b633bd748101","Type":"ContainerStarted","Data":"47e5f8cb902205fb77a7f4c4c551bb7a6df12a6648997cd78370f29aaa374cbd"} Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.386837 4974 generic.go:334] "Generic (PLEG): container finished" podID="34f9a6fc-6ac6-4269-b227-5daa5939a207" containerID="85dd670ae97432451f35231164dae0a834d1e9813e340e891d02fa1e1460b025" exitCode=0 Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.388105 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" event={"ID":"34f9a6fc-6ac6-4269-b227-5daa5939a207","Type":"ContainerDied","Data":"85dd670ae97432451f35231164dae0a834d1e9813e340e891d02fa1e1460b025"} Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.499180 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.559043 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:13 crc kubenswrapper[4974]: I1013 18:32:13.909092 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.021478 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-config\") pod \"34f9a6fc-6ac6-4269-b227-5daa5939a207\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.021551 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-svc\") pod \"34f9a6fc-6ac6-4269-b227-5daa5939a207\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.021744 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fxml\" (UniqueName: \"kubernetes.io/projected/34f9a6fc-6ac6-4269-b227-5daa5939a207-kube-api-access-4fxml\") pod \"34f9a6fc-6ac6-4269-b227-5daa5939a207\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.021801 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-nb\") pod \"34f9a6fc-6ac6-4269-b227-5daa5939a207\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.021864 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-sb\") pod \"34f9a6fc-6ac6-4269-b227-5daa5939a207\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.021906 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-swift-storage-0\") pod \"34f9a6fc-6ac6-4269-b227-5daa5939a207\" (UID: \"34f9a6fc-6ac6-4269-b227-5daa5939a207\") " Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.100031 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "34f9a6fc-6ac6-4269-b227-5daa5939a207" (UID: "34f9a6fc-6ac6-4269-b227-5daa5939a207"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.116580 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f9a6fc-6ac6-4269-b227-5daa5939a207-kube-api-access-4fxml" (OuterVolumeSpecName: "kube-api-access-4fxml") pod "34f9a6fc-6ac6-4269-b227-5daa5939a207" (UID: "34f9a6fc-6ac6-4269-b227-5daa5939a207"). InnerVolumeSpecName "kube-api-access-4fxml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.123920 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fxml\" (UniqueName: \"kubernetes.io/projected/34f9a6fc-6ac6-4269-b227-5daa5939a207-kube-api-access-4fxml\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.123948 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.130610 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.159799 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-config" (OuterVolumeSpecName: "config") pod "34f9a6fc-6ac6-4269-b227-5daa5939a207" (UID: "34f9a6fc-6ac6-4269-b227-5daa5939a207"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.183204 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34f9a6fc-6ac6-4269-b227-5daa5939a207" (UID: "34f9a6fc-6ac6-4269-b227-5daa5939a207"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.212109 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34f9a6fc-6ac6-4269-b227-5daa5939a207" (UID: "34f9a6fc-6ac6-4269-b227-5daa5939a207"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.226199 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "34f9a6fc-6ac6-4269-b227-5daa5939a207" (UID: "34f9a6fc-6ac6-4269-b227-5daa5939a207"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.229307 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.229341 4974 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.229352 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.229360 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34f9a6fc-6ac6-4269-b227-5daa5939a207-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.294030 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.447318 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"687bd2ca-d010-478f-902c-76dfb378ec55","Type":"ContainerStarted","Data":"64ee487e15500b3439edd2816fbc3d5e29ff09a6af814ce45e9ceb0108ea31b2"} Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.449884 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" event={"ID":"34f9a6fc-6ac6-4269-b227-5daa5939a207","Type":"ContainerDied","Data":"cfb346f6bce5283cd7c2bb371c5b2d97bc7fb4f9a5e5397cfee1feb0af648637"} Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.449912 4974 scope.go:117] "RemoveContainer" containerID="85dd670ae97432451f35231164dae0a834d1e9813e340e891d02fa1e1460b025" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.450007 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-899fdf8d7-pw7nj" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.463215 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de","Type":"ContainerStarted","Data":"37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf"} Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.479256 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"807d94ac-9bfc-465c-b60e-bb314831cdb1","Type":"ContainerStarted","Data":"ca0e2d958f28064928c2f1cf13276d24422f4a02543e3eaadc2d0a37a6cdf4d2"} Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.513705 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.558527 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8475fc656f-dnpll" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.638491 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-899fdf8d7-pw7nj"] Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 
18:32:14.671204 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e27cf425-7780-4d59-b887-ec5da0a83a71","Type":"ContainerStarted","Data":"4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59"} Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.707255 4974 generic.go:334] "Generic (PLEG): container finished" podID="33807b55-8fc9-44dc-9639-b633bd748101" containerID="4add1ee5b173309b2d755f30179584f4be4f02a53c48e99b6c034d6894024840" exitCode=0 Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.707542 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-899fdf8d7-pw7nj"] Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.707585 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.707590 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" event={"ID":"33807b55-8fc9-44dc-9639-b633bd748101","Type":"ContainerDied","Data":"4add1ee5b173309b2d755f30179584f4be4f02a53c48e99b6c034d6894024840"} Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.776855 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-579fddb58d-n5xbk"] Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.777279 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-579fddb58d-n5xbk" podUID="dec721b6-7daa-4481-8e76-df9054f32f97" containerName="neutron-api" containerID="cri-o://0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25" gracePeriod=30 Oct 13 18:32:14 crc kubenswrapper[4974]: I1013 18:32:14.777992 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-579fddb58d-n5xbk" podUID="dec721b6-7daa-4481-8e76-df9054f32f97" containerName="neutron-httpd" containerID="cri-o://0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548" gracePeriod=30 Oct 
13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.779430 4974 generic.go:334] "Generic (PLEG): container finished" podID="87f47666-886f-422d-99b5-607d95d84774" containerID="86740bc2eb3d5cd5632e4fa5352a768203ef4909fd0a60731da3326e2c249a71" exitCode=137 Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.779867 4974 generic.go:334] "Generic (PLEG): container finished" podID="87f47666-886f-422d-99b5-607d95d84774" containerID="adf7a89af60f2f7ef0da4065d108699bd3d59983e4dc22e64406a5aad69652dc" exitCode=137 Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.779974 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-865cd5d4d7-6kmth" event={"ID":"87f47666-886f-422d-99b5-607d95d84774","Type":"ContainerDied","Data":"86740bc2eb3d5cd5632e4fa5352a768203ef4909fd0a60731da3326e2c249a71"} Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.780000 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-865cd5d4d7-6kmth" event={"ID":"87f47666-886f-422d-99b5-607d95d84774","Type":"ContainerDied","Data":"adf7a89af60f2f7ef0da4065d108699bd3d59983e4dc22e64406a5aad69652dc"} Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.799729 4974 generic.go:334] "Generic (PLEG): container finished" podID="e2289a84-2fbc-42b5-884a-22f9134e8e15" containerID="6190c9d6b685824a6eb6200399691cc50f068b01e3d085e4e8cfcdf7ca07a66e" exitCode=137 Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.799755 4974 generic.go:334] "Generic (PLEG): container finished" podID="e2289a84-2fbc-42b5-884a-22f9134e8e15" containerID="a55b39f55821b2a80c3742aa351267ee7f94e96e5043681512efae6f920f584a" exitCode=137 Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.799788 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-784fdb6789-wc4j6" event={"ID":"e2289a84-2fbc-42b5-884a-22f9134e8e15","Type":"ContainerDied","Data":"6190c9d6b685824a6eb6200399691cc50f068b01e3d085e4e8cfcdf7ca07a66e"} Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 
18:32:15.799813 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-784fdb6789-wc4j6" event={"ID":"e2289a84-2fbc-42b5-884a-22f9134e8e15","Type":"ContainerDied","Data":"a55b39f55821b2a80c3742aa351267ee7f94e96e5043681512efae6f920f584a"} Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.802958 4974 generic.go:334] "Generic (PLEG): container finished" podID="1a480348-b0db-489e-be33-a93c1c6d311f" containerID="a2b12522dd03f99a07d5f8437c0839e57c2faebbb59c8d1ab0c2f6e558795e04" exitCode=137 Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.802986 4974 generic.go:334] "Generic (PLEG): container finished" podID="1a480348-b0db-489e-be33-a93c1c6d311f" containerID="d744872441ac27f8ea33b79021be8dc48a9a772584c8985c6bc37c67006261c3" exitCode=137 Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.803023 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcccd5645-2z74m" event={"ID":"1a480348-b0db-489e-be33-a93c1c6d311f","Type":"ContainerDied","Data":"a2b12522dd03f99a07d5f8437c0839e57c2faebbb59c8d1ab0c2f6e558795e04"} Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.803052 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcccd5645-2z74m" event={"ID":"1a480348-b0db-489e-be33-a93c1c6d311f","Type":"ContainerDied","Data":"d744872441ac27f8ea33b79021be8dc48a9a772584c8985c6bc37c67006261c3"} Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.804934 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e27cf425-7780-4d59-b887-ec5da0a83a71","Type":"ContainerStarted","Data":"fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611"} Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.807683 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.815465 4974 generic.go:334] "Generic (PLEG): container finished" 
podID="89790087-1d9c-4278-b62f-e18a94775048" containerID="e4ce9d5b57598d4e4ce005dd5f090268be817265c2f0db6be413ed342cbf8e56" exitCode=1 Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.838045 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" podStartSLOduration=4.838024883 podStartE2EDuration="4.838024883s" podCreationTimestamp="2025-10-13 18:32:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:15.826868129 +0000 UTC m=+1070.731234219" watchObservedRunningTime="2025-10-13 18:32:15.838024883 +0000 UTC m=+1070.742390963" Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.854795 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34f9a6fc-6ac6-4269-b227-5daa5939a207" path="/var/lib/kubelet/pods/34f9a6fc-6ac6-4269-b227-5daa5939a207/volumes" Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.855410 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"89790087-1d9c-4278-b62f-e18a94775048","Type":"ContainerDied","Data":"e4ce9d5b57598d4e4ce005dd5f090268be817265c2f0db6be413ed342cbf8e56"} Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.855446 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff5909ff-629e-4b68-bac5-2627113a0809","Type":"ContainerStarted","Data":"b42975a52ef37e2dfc038e0301c62c934affb90e431f45aba850a60aea7dc570"} Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.855472 4974 scope.go:117] "RemoveContainer" containerID="94d1eefd3d192e105822b0db71e2378748ce57d54984aef971389806e05dbccf" Oct 13 18:32:15 crc kubenswrapper[4974]: I1013 18:32:15.856184 4974 scope.go:117] "RemoveContainer" containerID="e4ce9d5b57598d4e4ce005dd5f090268be817265c2f0db6be413ed342cbf8e56" Oct 13 18:32:15 crc kubenswrapper[4974]: E1013 18:32:15.856426 
4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(89790087-1d9c-4278-b62f-e18a94775048)\"" pod="openstack/watcher-decision-engine-0" podUID="89790087-1d9c-4278-b62f-e18a94775048" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.016860 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.146302 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-scripts\") pod \"87f47666-886f-422d-99b5-607d95d84774\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.146734 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhg7p\" (UniqueName: \"kubernetes.io/projected/87f47666-886f-422d-99b5-607d95d84774-kube-api-access-zhg7p\") pod \"87f47666-886f-422d-99b5-607d95d84774\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.146762 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f47666-886f-422d-99b5-607d95d84774-logs\") pod \"87f47666-886f-422d-99b5-607d95d84774\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.146783 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-config-data\") pod \"87f47666-886f-422d-99b5-607d95d84774\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " Oct 13 18:32:16 crc kubenswrapper[4974]: 
I1013 18:32:16.146833 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/87f47666-886f-422d-99b5-607d95d84774-horizon-secret-key\") pod \"87f47666-886f-422d-99b5-607d95d84774\" (UID: \"87f47666-886f-422d-99b5-607d95d84774\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.147822 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87f47666-886f-422d-99b5-607d95d84774-logs" (OuterVolumeSpecName: "logs") pod "87f47666-886f-422d-99b5-607d95d84774" (UID: "87f47666-886f-422d-99b5-607d95d84774"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.162784 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87f47666-886f-422d-99b5-607d95d84774-kube-api-access-zhg7p" (OuterVolumeSpecName: "kube-api-access-zhg7p") pod "87f47666-886f-422d-99b5-607d95d84774" (UID: "87f47666-886f-422d-99b5-607d95d84774"). InnerVolumeSpecName "kube-api-access-zhg7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.198759 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87f47666-886f-422d-99b5-607d95d84774-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "87f47666-886f-422d-99b5-607d95d84774" (UID: "87f47666-886f-422d-99b5-607d95d84774"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.231895 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-scripts" (OuterVolumeSpecName: "scripts") pod "87f47666-886f-422d-99b5-607d95d84774" (UID: "87f47666-886f-422d-99b5-607d95d84774"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.234504 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-config-data" (OuterVolumeSpecName: "config-data") pod "87f47666-886f-422d-99b5-607d95d84774" (UID: "87f47666-886f-422d-99b5-607d95d84774"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.252634 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/87f47666-886f-422d-99b5-607d95d84774-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.252679 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.252690 4974 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/87f47666-886f-422d-99b5-607d95d84774-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.252700 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/87f47666-886f-422d-99b5-607d95d84774-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.252708 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhg7p\" (UniqueName: \"kubernetes.io/projected/87f47666-886f-422d-99b5-607d95d84774-kube-api-access-zhg7p\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.413563 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.414761 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.557317 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.557898 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2289a84-2fbc-42b5-884a-22f9134e8e15-logs\") pod \"e2289a84-2fbc-42b5-884a-22f9134e8e15\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.557953 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-scripts\") pod \"1a480348-b0db-489e-be33-a93c1c6d311f\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.558008 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a480348-b0db-489e-be33-a93c1c6d311f-logs\") pod \"1a480348-b0db-489e-be33-a93c1c6d311f\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.558046 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-config-data\") pod \"e2289a84-2fbc-42b5-884a-22f9134e8e15\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.558075 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/1a480348-b0db-489e-be33-a93c1c6d311f-horizon-secret-key\") pod \"1a480348-b0db-489e-be33-a93c1c6d311f\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.558094 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ppvzc\" (UniqueName: \"kubernetes.io/projected/1a480348-b0db-489e-be33-a93c1c6d311f-kube-api-access-ppvzc\") pod \"1a480348-b0db-489e-be33-a93c1c6d311f\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.558115 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e2289a84-2fbc-42b5-884a-22f9134e8e15-horizon-secret-key\") pod \"e2289a84-2fbc-42b5-884a-22f9134e8e15\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.558160 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-config-data\") pod \"1a480348-b0db-489e-be33-a93c1c6d311f\" (UID: \"1a480348-b0db-489e-be33-a93c1c6d311f\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.558210 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xs7nw\" (UniqueName: \"kubernetes.io/projected/e2289a84-2fbc-42b5-884a-22f9134e8e15-kube-api-access-xs7nw\") pod \"e2289a84-2fbc-42b5-884a-22f9134e8e15\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.558253 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-scripts\") pod \"e2289a84-2fbc-42b5-884a-22f9134e8e15\" (UID: \"e2289a84-2fbc-42b5-884a-22f9134e8e15\") " Oct 13 18:32:16 crc 
kubenswrapper[4974]: I1013 18:32:16.569913 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a480348-b0db-489e-be33-a93c1c6d311f-logs" (OuterVolumeSpecName: "logs") pod "1a480348-b0db-489e-be33-a93c1c6d311f" (UID: "1a480348-b0db-489e-be33-a93c1c6d311f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.573037 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2289a84-2fbc-42b5-884a-22f9134e8e15-kube-api-access-xs7nw" (OuterVolumeSpecName: "kube-api-access-xs7nw") pod "e2289a84-2fbc-42b5-884a-22f9134e8e15" (UID: "e2289a84-2fbc-42b5-884a-22f9134e8e15"). InnerVolumeSpecName "kube-api-access-xs7nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.576225 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a480348-b0db-489e-be33-a93c1c6d311f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1a480348-b0db-489e-be33-a93c1c6d311f" (UID: "1a480348-b0db-489e-be33-a93c1c6d311f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.583715 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2289a84-2fbc-42b5-884a-22f9134e8e15-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e2289a84-2fbc-42b5-884a-22f9134e8e15" (UID: "e2289a84-2fbc-42b5-884a-22f9134e8e15"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.584363 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2289a84-2fbc-42b5-884a-22f9134e8e15-logs" (OuterVolumeSpecName: "logs") pod "e2289a84-2fbc-42b5-884a-22f9134e8e15" (UID: "e2289a84-2fbc-42b5-884a-22f9134e8e15"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.591114 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a480348-b0db-489e-be33-a93c1c6d311f-kube-api-access-ppvzc" (OuterVolumeSpecName: "kube-api-access-ppvzc") pod "1a480348-b0db-489e-be33-a93c1c6d311f" (UID: "1a480348-b0db-489e-be33-a93c1c6d311f"). InnerVolumeSpecName "kube-api-access-ppvzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.619479 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.630211 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-scripts" (OuterVolumeSpecName: "scripts") pod "1a480348-b0db-489e-be33-a93c1c6d311f" (UID: "1a480348-b0db-489e-be33-a93c1c6d311f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.650387 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-scripts" (OuterVolumeSpecName: "scripts") pod "e2289a84-2fbc-42b5-884a-22f9134e8e15" (UID: "e2289a84-2fbc-42b5-884a-22f9134e8e15"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.664054 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xs7nw\" (UniqueName: \"kubernetes.io/projected/e2289a84-2fbc-42b5-884a-22f9134e8e15-kube-api-access-xs7nw\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.664079 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.664089 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2289a84-2fbc-42b5-884a-22f9134e8e15-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.664097 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.664105 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1a480348-b0db-489e-be33-a93c1c6d311f-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.664114 4974 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1a480348-b0db-489e-be33-a93c1c6d311f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.664123 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ppvzc\" (UniqueName: \"kubernetes.io/projected/1a480348-b0db-489e-be33-a93c1c6d311f-kube-api-access-ppvzc\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.664131 4974 reconciler_common.go:293] "Volume 
detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e2289a84-2fbc-42b5-884a-22f9134e8e15-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.665317 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-config-data" (OuterVolumeSpecName: "config-data") pod "e2289a84-2fbc-42b5-884a-22f9134e8e15" (UID: "e2289a84-2fbc-42b5-884a-22f9134e8e15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.675342 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-config-data" (OuterVolumeSpecName: "config-data") pod "1a480348-b0db-489e-be33-a93c1c6d311f" (UID: "1a480348-b0db-489e-be33-a93c1c6d311f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.765924 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e2289a84-2fbc-42b5-884a-22f9134e8e15-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.765951 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1a480348-b0db-489e-be33-a93c1c6d311f-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.870498 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"807d94ac-9bfc-465c-b60e-bb314831cdb1","Type":"ContainerStarted","Data":"8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53"} Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.873430 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"e27cf425-7780-4d59-b887-ec5da0a83a71","Type":"ContainerStarted","Data":"c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e"} Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.875741 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" event={"ID":"33807b55-8fc9-44dc-9639-b633bd748101","Type":"ContainerStarted","Data":"36fe83fb2a3be1c32d1dbc57d613e58f31a369a15a541a17891b27354a1de5e1"} Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.879423 4974 generic.go:334] "Generic (PLEG): container finished" podID="dec721b6-7daa-4481-8e76-df9054f32f97" containerID="0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548" exitCode=0 Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.879501 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-579fddb58d-n5xbk" event={"ID":"dec721b6-7daa-4481-8e76-df9054f32f97","Type":"ContainerDied","Data":"0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548"} Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.881802 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-865cd5d4d7-6kmth" event={"ID":"87f47666-886f-422d-99b5-607d95d84774","Type":"ContainerDied","Data":"066b37dc8014b7712dcf41b0c7df96eb4ec50feb8f127fce751cc20aad42b531"} Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.881820 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-865cd5d4d7-6kmth" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.881847 4974 scope.go:117] "RemoveContainer" containerID="86740bc2eb3d5cd5632e4fa5352a768203ef4909fd0a60731da3326e2c249a71" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.884548 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de","Type":"ContainerStarted","Data":"bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4"} Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.884711 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" containerName="cinder-api-log" containerID="cri-o://37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf" gracePeriod=30 Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.884979 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.885022 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" containerName="cinder-api" containerID="cri-o://bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4" gracePeriod=30 Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.896804 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff5909ff-629e-4b68-bac5-2627113a0809","Type":"ContainerStarted","Data":"f4433fa84e067b449392c0a1c7def6b9dfe6ecf86c9a283aafe7ceec46a8ec4a"} Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.903614 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"687bd2ca-d010-478f-902c-76dfb378ec55","Type":"ContainerStarted","Data":"4766525e68745961e7c2bbd9b68c665918efcdd1f0a79162423edca436494429"} Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.910842 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.910831384 podStartE2EDuration="6.910831384s" podCreationTimestamp="2025-10-13 18:32:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:16.910464124 +0000 UTC m=+1071.814830204" watchObservedRunningTime="2025-10-13 18:32:16.910831384 +0000 UTC m=+1071.815197454" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.919273 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-784fdb6789-wc4j6" event={"ID":"e2289a84-2fbc-42b5-884a-22f9134e8e15","Type":"ContainerDied","Data":"099dc291413330a09b1a30f43e1c80451f937b4117b405ca45d1b037b652df60"} Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.919416 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-784fdb6789-wc4j6" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.927148 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5fcccd5645-2z74m" event={"ID":"1a480348-b0db-489e-be33-a93c1c6d311f","Type":"ContainerDied","Data":"59130d1b7875579b8030802af7d30ef342a2d573bfd67c045e3031753c94b255"} Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.927254 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5fcccd5645-2z74m" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.944971 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-865cd5d4d7-6kmth"] Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.963345 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-865cd5d4d7-6kmth"] Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.969807 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.287160698 podStartE2EDuration="6.969787877s" podCreationTimestamp="2025-10-13 18:32:10 +0000 UTC" firstStartedPulling="2025-10-13 18:32:11.481238811 +0000 UTC m=+1066.385604891" lastFinishedPulling="2025-10-13 18:32:12.16386599 +0000 UTC m=+1067.068232070" observedRunningTime="2025-10-13 18:32:16.942403425 +0000 UTC m=+1071.846769525" watchObservedRunningTime="2025-10-13 18:32:16.969787877 +0000 UTC m=+1071.874153957" Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.992579 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5fcccd5645-2z74m"] Oct 13 18:32:16 crc kubenswrapper[4974]: I1013 18:32:16.998822 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5fcccd5645-2z74m"] Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.005929 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-784fdb6789-wc4j6"] Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.012143 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-784fdb6789-wc4j6"] Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.128861 4974 scope.go:117] "RemoveContainer" containerID="adf7a89af60f2f7ef0da4065d108699bd3d59983e4dc22e64406a5aad69652dc" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.281598 4974 scope.go:117] "RemoveContainer" containerID="6190c9d6b685824a6eb6200399691cc50f068b01e3d085e4e8cfcdf7ca07a66e" 
Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.323884 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="5c387632-008a-4609-8b64-ff84c35596c7" containerName="watcher-api" probeResult="failure" output="Get \"https://10.217.0.180:9322/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.339381 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.549789 4974 scope.go:117] "RemoveContainer" containerID="a55b39f55821b2a80c3742aa351267ee7f94e96e5043681512efae6f920f584a" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.590849 4974 scope.go:117] "RemoveContainer" containerID="a2b12522dd03f99a07d5f8437c0839e57c2faebbb59c8d1ab0c2f6e558795e04" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.704682 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.787132 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-scripts\") pod \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.787178 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data\") pod \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.787206 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wsl68\" (UniqueName: \"kubernetes.io/projected/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-kube-api-access-wsl68\") pod \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.787242 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-logs\") pod \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.787338 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data-custom\") pod \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.787379 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-etc-machine-id\") pod \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.787435 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-combined-ca-bundle\") pod \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\" (UID: \"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de\") " Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.790990 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" (UID: "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.791286 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-logs" (OuterVolumeSpecName: "logs") pod "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" (UID: "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.795846 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" (UID: "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.796006 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-kube-api-access-wsl68" (OuterVolumeSpecName: "kube-api-access-wsl68") pod "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" (UID: "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de"). InnerVolumeSpecName "kube-api-access-wsl68". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.816825 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-scripts" (OuterVolumeSpecName: "scripts") pod "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" (UID: "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.856285 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a480348-b0db-489e-be33-a93c1c6d311f" path="/var/lib/kubelet/pods/1a480348-b0db-489e-be33-a93c1c6d311f/volumes" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.857163 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87f47666-886f-422d-99b5-607d95d84774" path="/var/lib/kubelet/pods/87f47666-886f-422d-99b5-607d95d84774/volumes" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.858110 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2289a84-2fbc-42b5-884a-22f9134e8e15" path="/var/lib/kubelet/pods/e2289a84-2fbc-42b5-884a-22f9134e8e15/volumes" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.875885 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data" (OuterVolumeSpecName: "config-data") pod "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" (UID: 
"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.891817 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.891855 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.891865 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wsl68\" (UniqueName: \"kubernetes.io/projected/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-kube-api-access-wsl68\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.891874 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.891883 4974 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.891891 4974 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.917768 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" (UID: "eaf5ac8c-72ac-4321-817d-bb4a7bcc28de"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.959846 4974 generic.go:334] "Generic (PLEG): container finished" podID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" containerID="bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4" exitCode=0 Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.960094 4974 generic.go:334] "Generic (PLEG): container finished" podID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" containerID="37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf" exitCode=143 Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.960363 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.962828 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de","Type":"ContainerDied","Data":"bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4"} Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.962880 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de","Type":"ContainerDied","Data":"37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf"} Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.962892 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"eaf5ac8c-72ac-4321-817d-bb4a7bcc28de","Type":"ContainerDied","Data":"ea3598df805e354e2919381889edd9c87b11d19feffc3c1c7aad1c6b86a098f7"} Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.969125 4974 scope.go:117] "RemoveContainer" containerID="d744872441ac27f8ea33b79021be8dc48a9a772584c8985c6bc37c67006261c3" Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.988927 
4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"807d94ac-9bfc-465c-b60e-bb314831cdb1","Type":"ContainerStarted","Data":"69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e"} Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.989125 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="807d94ac-9bfc-465c-b60e-bb314831cdb1" containerName="glance-log" containerID="cri-o://8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53" gracePeriod=30 Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.989809 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="807d94ac-9bfc-465c-b60e-bb314831cdb1" containerName="glance-httpd" containerID="cri-o://69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e" gracePeriod=30 Oct 13 18:32:17 crc kubenswrapper[4974]: I1013 18:32:17.993774 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.043002 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff5909ff-629e-4b68-bac5-2627113a0809","Type":"ContainerStarted","Data":"7e5aa0cf734cdaf040b5985f2e56bea01dc2e36f17c9e62b4117395de738f59d"} Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.043163 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ff5909ff-629e-4b68-bac5-2627113a0809" containerName="glance-log" containerID="cri-o://f4433fa84e067b449392c0a1c7def6b9dfe6ecf86c9a283aafe7ceec46a8ec4a" gracePeriod=30 Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.043459 4974 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="ff5909ff-629e-4b68-bac5-2627113a0809" containerName="glance-httpd" containerID="cri-o://7e5aa0cf734cdaf040b5985f2e56bea01dc2e36f17c9e62b4117395de738f59d" gracePeriod=30 Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.066820 4974 scope.go:117] "RemoveContainer" containerID="bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.120112 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.120094754 podStartE2EDuration="7.120094754s" podCreationTimestamp="2025-10-13 18:32:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:18.017166441 +0000 UTC m=+1072.921532521" watchObservedRunningTime="2025-10-13 18:32:18.120094754 +0000 UTC m=+1073.024460834" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.144582 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.147382 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.190844 4974 scope.go:117] "RemoveContainer" containerID="37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.197772 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.197754053 podStartE2EDuration="7.197754053s" podCreationTimestamp="2025-10-13 18:32:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:18.097506307 +0000 UTC 
m=+1073.001872387" watchObservedRunningTime="2025-10-13 18:32:18.197754053 +0000 UTC m=+1073.102120133" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.225745 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.226174 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f47666-886f-422d-99b5-607d95d84774" containerName="horizon-log" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.226190 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f47666-886f-422d-99b5-607d95d84774" containerName="horizon-log" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.226201 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a480348-b0db-489e-be33-a93c1c6d311f" containerName="horizon-log" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.226207 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a480348-b0db-489e-be33-a93c1c6d311f" containerName="horizon-log" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.226218 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a480348-b0db-489e-be33-a93c1c6d311f" containerName="horizon" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.226224 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a480348-b0db-489e-be33-a93c1c6d311f" containerName="horizon" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.226236 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2289a84-2fbc-42b5-884a-22f9134e8e15" containerName="horizon" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.226242 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2289a84-2fbc-42b5-884a-22f9134e8e15" containerName="horizon" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.226258 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87f47666-886f-422d-99b5-607d95d84774" containerName="horizon" Oct 13 
18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.226265 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="87f47666-886f-422d-99b5-607d95d84774" containerName="horizon" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.226273 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" containerName="cinder-api-log" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.226279 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" containerName="cinder-api-log" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.226290 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" containerName="cinder-api" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.226297 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" containerName="cinder-api" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.226309 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f9a6fc-6ac6-4269-b227-5daa5939a207" containerName="init" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.226314 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f9a6fc-6ac6-4269-b227-5daa5939a207" containerName="init" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.226336 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2289a84-2fbc-42b5-884a-22f9134e8e15" containerName="horizon-log" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.226341 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2289a84-2fbc-42b5-884a-22f9134e8e15" containerName="horizon-log" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.231810 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f47666-886f-422d-99b5-607d95d84774" containerName="horizon-log" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.231874 4974 
memory_manager.go:354] "RemoveStaleState removing state" podUID="1a480348-b0db-489e-be33-a93c1c6d311f" containerName="horizon" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.231891 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" containerName="cinder-api-log" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.231927 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f9a6fc-6ac6-4269-b227-5daa5939a207" containerName="init" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.231948 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a480348-b0db-489e-be33-a93c1c6d311f" containerName="horizon-log" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.231958 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2289a84-2fbc-42b5-884a-22f9134e8e15" containerName="horizon-log" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.231994 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" containerName="cinder-api" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.232014 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2289a84-2fbc-42b5-884a-22f9134e8e15" containerName="horizon" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.232027 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="87f47666-886f-422d-99b5-607d95d84774" containerName="horizon" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.243672 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.245812 4974 scope.go:117] "RemoveContainer" containerID="bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.246740 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4\": container with ID starting with bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4 not found: ID does not exist" containerID="bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.246770 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4"} err="failed to get container status \"bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4\": rpc error: code = NotFound desc = could not find container \"bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4\": container with ID starting with bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4 not found: ID does not exist" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.246795 4974 scope.go:117] "RemoveContainer" containerID="37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.247522 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf\": container with ID starting with 37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf not found: ID does not exist" containerID="37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.247631 
4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf"} err="failed to get container status \"37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf\": rpc error: code = NotFound desc = could not find container \"37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf\": container with ID starting with 37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf not found: ID does not exist" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.247721 4974 scope.go:117] "RemoveContainer" containerID="bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.248022 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4"} err="failed to get container status \"bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4\": rpc error: code = NotFound desc = could not find container \"bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4\": container with ID starting with bf7d137c102f42339649d2e17e4daf28746aab211cdbdc7b9899e56440f8f1c4 not found: ID does not exist" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.248106 4974 scope.go:117] "RemoveContainer" containerID="37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.248392 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf"} err="failed to get container status \"37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf\": rpc error: code = NotFound desc = could not find container \"37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf\": container with ID starting with 
37ca03fa4f280225e56a8e5a43c7da5808322ad6b8df5cd14e7d5482c488b8cf not found: ID does not exist" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.253476 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.253944 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.254197 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.278276 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.405200 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-logs\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.405246 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.405298 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-config-data\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.405313 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.405346 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.405375 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-scripts\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.405618 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.405678 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.405794 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rvdt\" (UniqueName: 
\"kubernetes.io/projected/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-kube-api-access-7rvdt\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.417626 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.417684 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.418349 4974 scope.go:117] "RemoveContainer" containerID="e4ce9d5b57598d4e4ce005dd5f090268be817265c2f0db6be413ed342cbf8e56" Oct 13 18:32:18 crc kubenswrapper[4974]: E1013 18:32:18.418609 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(89790087-1d9c-4278-b62f-e18a94775048)\"" pod="openstack/watcher-decision-engine-0" podUID="89790087-1d9c-4278-b62f-e18a94775048" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507073 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rvdt\" (UniqueName: \"kubernetes.io/projected/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-kube-api-access-7rvdt\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507158 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-logs\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507181 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507232 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507249 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-config-data\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507264 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507300 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-scripts\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507377 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507401 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507477 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.507641 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-logs\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.513638 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-config-data\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.514412 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-config-data-custom\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.517163 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-scripts\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.517724 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.518296 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.526508 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rvdt\" (UniqueName: \"kubernetes.io/projected/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-kube-api-access-7rvdt\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.527006 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d89238-6f74-4da7-aa6d-1b6c5f56a204-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a4d89238-6f74-4da7-aa6d-1b6c5f56a204\") " pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.698305 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Oct 13 18:32:18 crc kubenswrapper[4974]: I1013 18:32:18.998503 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.083863 4974 generic.go:334] "Generic (PLEG): container finished" podID="807d94ac-9bfc-465c-b60e-bb314831cdb1" containerID="69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e" exitCode=0 Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.084281 4974 generic.go:334] "Generic (PLEG): container finished" podID="807d94ac-9bfc-465c-b60e-bb314831cdb1" containerID="8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53" exitCode=143 Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.084098 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.084121 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"807d94ac-9bfc-465c-b60e-bb314831cdb1","Type":"ContainerDied","Data":"69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e"} Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.084389 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"807d94ac-9bfc-465c-b60e-bb314831cdb1","Type":"ContainerDied","Data":"8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53"} Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.084470 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"807d94ac-9bfc-465c-b60e-bb314831cdb1","Type":"ContainerDied","Data":"ca0e2d958f28064928c2f1cf13276d24422f4a02543e3eaadc2d0a37a6cdf4d2"} Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.084488 4974 scope.go:117] "RemoveContainer" containerID="69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.093475 4974 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"e27cf425-7780-4d59-b887-ec5da0a83a71","Type":"ContainerStarted","Data":"ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91"} Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.093856 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.099232 4974 generic.go:334] "Generic (PLEG): container finished" podID="ff5909ff-629e-4b68-bac5-2627113a0809" containerID="7e5aa0cf734cdaf040b5985f2e56bea01dc2e36f17c9e62b4117395de738f59d" exitCode=0 Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.099273 4974 generic.go:334] "Generic (PLEG): container finished" podID="ff5909ff-629e-4b68-bac5-2627113a0809" containerID="f4433fa84e067b449392c0a1c7def6b9dfe6ecf86c9a283aafe7ceec46a8ec4a" exitCode=143 Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.099313 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff5909ff-629e-4b68-bac5-2627113a0809","Type":"ContainerDied","Data":"7e5aa0cf734cdaf040b5985f2e56bea01dc2e36f17c9e62b4117395de738f59d"} Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.099356 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff5909ff-629e-4b68-bac5-2627113a0809","Type":"ContainerDied","Data":"f4433fa84e067b449392c0a1c7def6b9dfe6ecf86c9a283aafe7ceec46a8ec4a"} Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.122438 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.4845712779999998 podStartE2EDuration="8.122418196s" podCreationTimestamp="2025-10-13 18:32:11 +0000 UTC" firstStartedPulling="2025-10-13 18:32:12.983858502 +0000 UTC m=+1067.888224582" lastFinishedPulling="2025-10-13 18:32:17.62170542 +0000 UTC m=+1072.526071500" observedRunningTime="2025-10-13 18:32:19.113192396 
+0000 UTC m=+1074.017558476" watchObservedRunningTime="2025-10-13 18:32:19.122418196 +0000 UTC m=+1074.026784276" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.124254 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-scripts\") pod \"807d94ac-9bfc-465c-b60e-bb314831cdb1\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.124406 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-logs\") pod \"807d94ac-9bfc-465c-b60e-bb314831cdb1\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.124438 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"807d94ac-9bfc-465c-b60e-bb314831cdb1\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.124473 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-httpd-run\") pod \"807d94ac-9bfc-465c-b60e-bb314831cdb1\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.124518 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-config-data\") pod \"807d94ac-9bfc-465c-b60e-bb314831cdb1\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.124548 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxxsc\" (UniqueName: 
\"kubernetes.io/projected/807d94ac-9bfc-465c-b60e-bb314831cdb1-kube-api-access-lxxsc\") pod \"807d94ac-9bfc-465c-b60e-bb314831cdb1\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.124611 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-combined-ca-bundle\") pod \"807d94ac-9bfc-465c-b60e-bb314831cdb1\" (UID: \"807d94ac-9bfc-465c-b60e-bb314831cdb1\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.125893 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-logs" (OuterVolumeSpecName: "logs") pod "807d94ac-9bfc-465c-b60e-bb314831cdb1" (UID: "807d94ac-9bfc-465c-b60e-bb314831cdb1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.126078 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "807d94ac-9bfc-465c-b60e-bb314831cdb1" (UID: "807d94ac-9bfc-465c-b60e-bb314831cdb1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.132185 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-scripts" (OuterVolumeSpecName: "scripts") pod "807d94ac-9bfc-465c-b60e-bb314831cdb1" (UID: "807d94ac-9bfc-465c-b60e-bb314831cdb1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.132694 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/807d94ac-9bfc-465c-b60e-bb314831cdb1-kube-api-access-lxxsc" (OuterVolumeSpecName: "kube-api-access-lxxsc") pod "807d94ac-9bfc-465c-b60e-bb314831cdb1" (UID: "807d94ac-9bfc-465c-b60e-bb314831cdb1"). InnerVolumeSpecName "kube-api-access-lxxsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.135286 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "807d94ac-9bfc-465c-b60e-bb314831cdb1" (UID: "807d94ac-9bfc-465c-b60e-bb314831cdb1"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.162765 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "807d94ac-9bfc-465c-b60e-bb314831cdb1" (UID: "807d94ac-9bfc-465c-b60e-bb314831cdb1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.188143 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-config-data" (OuterVolumeSpecName: "config-data") pod "807d94ac-9bfc-465c-b60e-bb314831cdb1" (UID: "807d94ac-9bfc-465c-b60e-bb314831cdb1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.227047 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.227092 4974 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.227106 4974 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/807d94ac-9bfc-465c-b60e-bb314831cdb1-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.227115 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.227125 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxxsc\" (UniqueName: \"kubernetes.io/projected/807d94ac-9bfc-465c-b60e-bb314831cdb1-kube-api-access-lxxsc\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.227135 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.227143 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/807d94ac-9bfc-465c-b60e-bb314831cdb1-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.250362 4974 operation_generator.go:917] UnmountDevice succeeded 
for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.263834 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.280391 4974 scope.go:117] "RemoveContainer" containerID="8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.284283 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Oct 13 18:32:19 crc kubenswrapper[4974]: W1013 18:32:19.287861 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda4d89238_6f74_4da7_aa6d_1b6c5f56a204.slice/crio-84c275aa13eea177d83069a8cac319373d3062534b29f4ef925b7edbb4d752f7 WatchSource:0}: Error finding container 84c275aa13eea177d83069a8cac319373d3062534b29f4ef925b7edbb4d752f7: Status 404 returned error can't find the container with id 84c275aa13eea177d83069a8cac319373d3062534b29f4ef925b7edbb4d752f7 Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.329253 4974 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.421756 4974 scope.go:117] "RemoveContainer" containerID="69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e" Oct 13 18:32:19 crc kubenswrapper[4974]: E1013 18:32:19.426198 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e\": container with ID starting with 69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e not found: ID does not exist" 
containerID="69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.426249 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e"} err="failed to get container status \"69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e\": rpc error: code = NotFound desc = could not find container \"69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e\": container with ID starting with 69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e not found: ID does not exist" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.426276 4974 scope.go:117] "RemoveContainer" containerID="8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53" Oct 13 18:32:19 crc kubenswrapper[4974]: E1013 18:32:19.426719 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53\": container with ID starting with 8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53 not found: ID does not exist" containerID="8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.426743 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53"} err="failed to get container status \"8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53\": rpc error: code = NotFound desc = could not find container \"8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53\": container with ID starting with 8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53 not found: ID does not exist" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.426765 4974 scope.go:117] 
"RemoveContainer" containerID="69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.427095 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e"} err="failed to get container status \"69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e\": rpc error: code = NotFound desc = could not find container \"69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e\": container with ID starting with 69aae269099dc264b8c27d60bee9a66871b83ec22184c0975b7c7ffd10ee239e not found: ID does not exist" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.427120 4974 scope.go:117] "RemoveContainer" containerID="8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.427401 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53"} err="failed to get container status \"8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53\": rpc error: code = NotFound desc = could not find container \"8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53\": container with ID starting with 8a038ee0d85fe261ed7b3d5e77dd8b62118536326c2d576856513562f387af53 not found: ID does not exist" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.428267 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.430101 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ff5909ff-629e-4b68-bac5-2627113a0809\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 
18:32:19.430229 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-combined-ca-bundle\") pod \"ff5909ff-629e-4b68-bac5-2627113a0809\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.430264 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-logs\") pod \"ff5909ff-629e-4b68-bac5-2627113a0809\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.430291 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-config-data\") pod \"ff5909ff-629e-4b68-bac5-2627113a0809\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.430537 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-scripts\") pod \"ff5909ff-629e-4b68-bac5-2627113a0809\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.430583 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-httpd-run\") pod \"ff5909ff-629e-4b68-bac5-2627113a0809\" (UID: \"ff5909ff-629e-4b68-bac5-2627113a0809\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.430621 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p649n\" (UniqueName: \"kubernetes.io/projected/ff5909ff-629e-4b68-bac5-2627113a0809-kube-api-access-p649n\") pod \"ff5909ff-629e-4b68-bac5-2627113a0809\" (UID: 
\"ff5909ff-629e-4b68-bac5-2627113a0809\") " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.430803 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-logs" (OuterVolumeSpecName: "logs") pod "ff5909ff-629e-4b68-bac5-2627113a0809" (UID: "ff5909ff-629e-4b68-bac5-2627113a0809"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.431093 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.431240 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ff5909ff-629e-4b68-bac5-2627113a0809" (UID: "ff5909ff-629e-4b68-bac5-2627113a0809"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.435152 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-scripts" (OuterVolumeSpecName: "scripts") pod "ff5909ff-629e-4b68-bac5-2627113a0809" (UID: "ff5909ff-629e-4b68-bac5-2627113a0809"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.442841 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "ff5909ff-629e-4b68-bac5-2627113a0809" (UID: "ff5909ff-629e-4b68-bac5-2627113a0809"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.443320 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff5909ff-629e-4b68-bac5-2627113a0809-kube-api-access-p649n" (OuterVolumeSpecName: "kube-api-access-p649n") pod "ff5909ff-629e-4b68-bac5-2627113a0809" (UID: "ff5909ff-629e-4b68-bac5-2627113a0809"). InnerVolumeSpecName "kube-api-access-p649n". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.445788 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.466340 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:19 crc kubenswrapper[4974]: E1013 18:32:19.466822 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807d94ac-9bfc-465c-b60e-bb314831cdb1" containerName="glance-httpd" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.466836 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="807d94ac-9bfc-465c-b60e-bb314831cdb1" containerName="glance-httpd" Oct 13 18:32:19 crc kubenswrapper[4974]: E1013 18:32:19.466854 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="807d94ac-9bfc-465c-b60e-bb314831cdb1" containerName="glance-log" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.466866 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="807d94ac-9bfc-465c-b60e-bb314831cdb1" containerName="glance-log" Oct 13 18:32:19 crc kubenswrapper[4974]: E1013 18:32:19.466888 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5909ff-629e-4b68-bac5-2627113a0809" containerName="glance-log" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.466896 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5909ff-629e-4b68-bac5-2627113a0809" 
containerName="glance-log" Oct 13 18:32:19 crc kubenswrapper[4974]: E1013 18:32:19.466917 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff5909ff-629e-4b68-bac5-2627113a0809" containerName="glance-httpd" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.466922 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff5909ff-629e-4b68-bac5-2627113a0809" containerName="glance-httpd" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.467133 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="807d94ac-9bfc-465c-b60e-bb314831cdb1" containerName="glance-log" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.467188 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="807d94ac-9bfc-465c-b60e-bb314831cdb1" containerName="glance-httpd" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.467201 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5909ff-629e-4b68-bac5-2627113a0809" containerName="glance-log" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.467217 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff5909ff-629e-4b68-bac5-2627113a0809" containerName="glance-httpd" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.468374 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.473876 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.476329 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.477085 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.489853 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff5909ff-629e-4b68-bac5-2627113a0809" (UID: "ff5909ff-629e-4b68-bac5-2627113a0809"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.532717 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.532746 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.532755 4974 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ff5909ff-629e-4b68-bac5-2627113a0809-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.532766 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p649n\" (UniqueName: 
\"kubernetes.io/projected/ff5909ff-629e-4b68-bac5-2627113a0809-kube-api-access-p649n\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.532797 4974 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.544934 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-config-data" (OuterVolumeSpecName: "config-data") pod "ff5909ff-629e-4b68-bac5-2627113a0809" (UID: "ff5909ff-629e-4b68-bac5-2627113a0809"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.566964 4974 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.634324 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7d88\" (UniqueName: \"kubernetes.io/projected/0155adb4-e317-4927-913e-acd03779ad3f-kube-api-access-r7d88\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.634395 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-logs\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.634416 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.634440 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.634460 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.634477 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.634529 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.634580 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.634768 4974 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.634791 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff5909ff-629e-4b68-bac5-2627113a0809-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.736753 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7d88\" (UniqueName: \"kubernetes.io/projected/0155adb4-e317-4927-913e-acd03779ad3f-kube-api-access-r7d88\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.736833 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-logs\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.736851 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.736880 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.736901 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.736918 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.736948 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.737000 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.738045 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-logs\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.738267 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.740098 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.744068 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-scripts\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.745160 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.753788 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.756148 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7d88\" (UniqueName: \"kubernetes.io/projected/0155adb4-e317-4927-913e-acd03779ad3f-kube-api-access-r7d88\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.757624 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-config-data\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.772970 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.819038 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.865991 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="807d94ac-9bfc-465c-b60e-bb314831cdb1" path="/var/lib/kubelet/pods/807d94ac-9bfc-465c-b60e-bb314831cdb1/volumes" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.867145 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaf5ac8c-72ac-4321-817d-bb4a7bcc28de" path="/var/lib/kubelet/pods/eaf5ac8c-72ac-4321-817d-bb4a7bcc28de/volumes" Oct 13 18:32:19 crc kubenswrapper[4974]: I1013 18:32:19.908938 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.042415 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-ovndb-tls-certs\") pod \"dec721b6-7daa-4481-8e76-df9054f32f97\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.042499 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hd5fz\" (UniqueName: \"kubernetes.io/projected/dec721b6-7daa-4481-8e76-df9054f32f97-kube-api-access-hd5fz\") pod \"dec721b6-7daa-4481-8e76-df9054f32f97\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.042545 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-httpd-config\") pod \"dec721b6-7daa-4481-8e76-df9054f32f97\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.042561 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-config\") pod \"dec721b6-7daa-4481-8e76-df9054f32f97\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.042695 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-combined-ca-bundle\") pod \"dec721b6-7daa-4481-8e76-df9054f32f97\" (UID: \"dec721b6-7daa-4481-8e76-df9054f32f97\") " Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.048504 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec721b6-7daa-4481-8e76-df9054f32f97-kube-api-access-hd5fz" (OuterVolumeSpecName: "kube-api-access-hd5fz") pod "dec721b6-7daa-4481-8e76-df9054f32f97" (UID: "dec721b6-7daa-4481-8e76-df9054f32f97"). InnerVolumeSpecName "kube-api-access-hd5fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.053375 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "dec721b6-7daa-4481-8e76-df9054f32f97" (UID: "dec721b6-7daa-4481-8e76-df9054f32f97"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.102599 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-config" (OuterVolumeSpecName: "config") pod "dec721b6-7daa-4481-8e76-df9054f32f97" (UID: "dec721b6-7daa-4481-8e76-df9054f32f97"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.112959 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dec721b6-7daa-4481-8e76-df9054f32f97" (UID: "dec721b6-7daa-4481-8e76-df9054f32f97"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.128628 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"ff5909ff-629e-4b68-bac5-2627113a0809","Type":"ContainerDied","Data":"b42975a52ef37e2dfc038e0301c62c934affb90e431f45aba850a60aea7dc570"} Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.128740 4974 scope.go:117] "RemoveContainer" containerID="7e5aa0cf734cdaf040b5985f2e56bea01dc2e36f17c9e62b4117395de738f59d" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.128849 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.134111 4974 generic.go:334] "Generic (PLEG): container finished" podID="dec721b6-7daa-4481-8e76-df9054f32f97" containerID="0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25" exitCode=0 Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.134163 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-579fddb58d-n5xbk" event={"ID":"dec721b6-7daa-4481-8e76-df9054f32f97","Type":"ContainerDied","Data":"0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25"} Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.134187 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-579fddb58d-n5xbk" event={"ID":"dec721b6-7daa-4481-8e76-df9054f32f97","Type":"ContainerDied","Data":"ac14dfe3d673da1c43de609033de25c209c82cf1ade6ae877d97d64706cd34ab"} Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.134236 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-579fddb58d-n5xbk" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.136636 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4d89238-6f74-4da7-aa6d-1b6c5f56a204","Type":"ContainerStarted","Data":"34cdbdfde66e339ca3f22f7c17141e5814e983ca92b8f4120076774def55542e"} Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.136680 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4d89238-6f74-4da7-aa6d-1b6c5f56a204","Type":"ContainerStarted","Data":"84c275aa13eea177d83069a8cac319373d3062534b29f4ef925b7edbb4d752f7"} Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.145775 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hd5fz\" (UniqueName: \"kubernetes.io/projected/dec721b6-7daa-4481-8e76-df9054f32f97-kube-api-access-hd5fz\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.145808 4974 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-httpd-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.145818 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.145827 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.154692 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod 
"dec721b6-7daa-4481-8e76-df9054f32f97" (UID: "dec721b6-7daa-4481-8e76-df9054f32f97"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.247680 4974 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/dec721b6-7daa-4481-8e76-df9054f32f97-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.272333 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.275905 4974 scope.go:117] "RemoveContainer" containerID="f4433fa84e067b449392c0a1c7def6b9dfe6ecf86c9a283aafe7ceec46a8ec4a" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.291091 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.307127 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:20 crc kubenswrapper[4974]: E1013 18:32:20.309907 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec721b6-7daa-4481-8e76-df9054f32f97" containerName="neutron-api" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.309933 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec721b6-7daa-4481-8e76-df9054f32f97" containerName="neutron-api" Oct 13 18:32:20 crc kubenswrapper[4974]: E1013 18:32:20.309960 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec721b6-7daa-4481-8e76-df9054f32f97" containerName="neutron-httpd" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.309967 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec721b6-7daa-4481-8e76-df9054f32f97" containerName="neutron-httpd" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.310167 4974 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="dec721b6-7daa-4481-8e76-df9054f32f97" containerName="neutron-httpd" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.310198 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec721b6-7daa-4481-8e76-df9054f32f97" containerName="neutron-api" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.310835 4974 scope.go:117] "RemoveContainer" containerID="0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.311323 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.316842 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.320087 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.320285 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.370125 4974 scope.go:117] "RemoveContainer" containerID="0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.397583 4974 scope.go:117] "RemoveContainer" containerID="0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548" Oct 13 18:32:20 crc kubenswrapper[4974]: E1013 18:32:20.399897 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548\": container with ID starting with 0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548 not found: ID does not exist" 
containerID="0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.399934 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548"} err="failed to get container status \"0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548\": rpc error: code = NotFound desc = could not find container \"0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548\": container with ID starting with 0fb7c14a9a0a3f9e5f1d999b53a31b6d420b1245260b15b8e44359c5a8391548 not found: ID does not exist" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.399975 4974 scope.go:117] "RemoveContainer" containerID="0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25" Oct 13 18:32:20 crc kubenswrapper[4974]: E1013 18:32:20.400920 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25\": container with ID starting with 0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25 not found: ID does not exist" containerID="0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.400961 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25"} err="failed to get container status \"0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25\": rpc error: code = NotFound desc = could not find container \"0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25\": container with ID starting with 0272dd6ddac5bc9ec50be3cd83e176a9204176dcc88cb2cea01025de5ff93a25 not found: ID does not exist" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.436296 4974 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.450022 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjb6h\" (UniqueName: \"kubernetes.io/projected/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-kube-api-access-xjb6h\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.450073 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.450224 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-logs\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.450254 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.450287 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.450316 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.450382 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.450421 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.511061 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-579fddb58d-n5xbk"] Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.518274 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-579fddb58d-n5xbk"] Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.551929 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " 
pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.551967 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.552022 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.552049 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.552093 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjb6h\" (UniqueName: \"kubernetes.io/projected/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-kube-api-access-xjb6h\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.552118 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc 
kubenswrapper[4974]: I1013 18:32:20.552199 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-logs\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.552217 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.552706 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.553503 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.554302 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-logs\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.558355 4974 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-scripts\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.558444 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.560819 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-config-data\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.575005 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjb6h\" (UniqueName: \"kubernetes.io/projected/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-kube-api-access-xjb6h\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.579134 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.585996 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.637098 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:20 crc kubenswrapper[4974]: I1013 18:32:20.909241 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 13 18:32:21 crc kubenswrapper[4974]: I1013 18:32:21.105685 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 13 18:32:21 crc kubenswrapper[4974]: I1013 18:32:21.154462 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0155adb4-e317-4927-913e-acd03779ad3f","Type":"ContainerStarted","Data":"6c85c3841126f94ae7a20b776a5878ca50b1c42baf21f49507a2384c5f5cab44"} Oct 13 18:32:21 crc kubenswrapper[4974]: I1013 18:32:21.154763 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0155adb4-e317-4927-913e-acd03779ad3f","Type":"ContainerStarted","Data":"34de5f8990d443d98bb8edc72eb36cf700ebfd21a276a09306c64a3fcffcc5e5"} Oct 13 18:32:21 crc kubenswrapper[4974]: I1013 18:32:21.160204 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a4d89238-6f74-4da7-aa6d-1b6c5f56a204","Type":"ContainerStarted","Data":"925512cc798e4fac72ec68e3be6c780226b98b08336e6d4e263eccf7ab8a71f6"} Oct 13 18:32:21 crc kubenswrapper[4974]: I1013 18:32:21.160237 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Oct 13 18:32:21 crc kubenswrapper[4974]: I1013 18:32:21.238433 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 18:32:21 crc 
kubenswrapper[4974]: I1013 18:32:21.268699 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.268678907 podStartE2EDuration="3.268678907s" podCreationTimestamp="2025-10-13 18:32:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:21.243542618 +0000 UTC m=+1076.147908698" watchObservedRunningTime="2025-10-13 18:32:21.268678907 +0000 UTC m=+1076.173044987" Oct 13 18:32:21 crc kubenswrapper[4974]: I1013 18:32:21.301821 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:21 crc kubenswrapper[4974]: I1013 18:32:21.674464 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5796767b68-9dktc" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Oct 13 18:32:21 crc kubenswrapper[4974]: I1013 18:32:21.821613 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec721b6-7daa-4481-8e76-df9054f32f97" path="/var/lib/kubelet/pods/dec721b6-7daa-4481-8e76-df9054f32f97/volumes" Oct 13 18:32:21 crc kubenswrapper[4974]: I1013 18:32:21.822742 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff5909ff-629e-4b68-bac5-2627113a0809" path="/var/lib/kubelet/pods/ff5909ff-629e-4b68-bac5-2627113a0809/volumes" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.059324 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.146864 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c686c5b55-x94zv"] Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.147143 
4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" podUID="60c938e8-f8e0-4006-8124-6929b8945dbf" containerName="dnsmasq-dns" containerID="cri-o://f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694" gracePeriod=10 Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.221412 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef","Type":"ContainerStarted","Data":"8d528756029a83c4ee5b5f12ec17f1b17f8bf8b7eb37ef95586ad4678d508192"} Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.221802 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef","Type":"ContainerStarted","Data":"84cb0d9ee8808f06b9db022d7e0f557179b8429e48ee7593f67deabc2633735a"} Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.231071 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="687bd2ca-d010-478f-902c-76dfb378ec55" containerName="cinder-scheduler" containerID="cri-o://64ee487e15500b3439edd2816fbc3d5e29ff09a6af814ce45e9ceb0108ea31b2" gracePeriod=30 Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.232321 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0155adb4-e317-4927-913e-acd03779ad3f","Type":"ContainerStarted","Data":"b686047b632d9fec8762385ed496d43fcc0f784ad09f4078123fe30ee3fbaa82"} Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.233224 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="687bd2ca-d010-478f-902c-76dfb378ec55" containerName="probe" containerID="cri-o://4766525e68745961e7c2bbd9b68c665918efcdd1f0a79162423edca436494429" gracePeriod=30 Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 
18:32:22.260173 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.260156674 podStartE2EDuration="3.260156674s" podCreationTimestamp="2025-10-13 18:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:22.25716837 +0000 UTC m=+1077.161534470" watchObservedRunningTime="2025-10-13 18:32:22.260156674 +0000 UTC m=+1077.164522754" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.370684 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.382626 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.762916 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.818203 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-nb\") pod \"60c938e8-f8e0-4006-8124-6929b8945dbf\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.818285 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-sb\") pod \"60c938e8-f8e0-4006-8124-6929b8945dbf\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.818350 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-swift-storage-0\") pod \"60c938e8-f8e0-4006-8124-6929b8945dbf\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.818411 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2j74\" (UniqueName: \"kubernetes.io/projected/60c938e8-f8e0-4006-8124-6929b8945dbf-kube-api-access-t2j74\") pod \"60c938e8-f8e0-4006-8124-6929b8945dbf\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.818471 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-config\") pod \"60c938e8-f8e0-4006-8124-6929b8945dbf\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.818496 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-svc\") pod \"60c938e8-f8e0-4006-8124-6929b8945dbf\" (UID: \"60c938e8-f8e0-4006-8124-6929b8945dbf\") " Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.825005 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60c938e8-f8e0-4006-8124-6929b8945dbf-kube-api-access-t2j74" (OuterVolumeSpecName: "kube-api-access-t2j74") pod "60c938e8-f8e0-4006-8124-6929b8945dbf" (UID: "60c938e8-f8e0-4006-8124-6929b8945dbf"). InnerVolumeSpecName "kube-api-access-t2j74". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.878742 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "60c938e8-f8e0-4006-8124-6929b8945dbf" (UID: "60c938e8-f8e0-4006-8124-6929b8945dbf"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.880822 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "60c938e8-f8e0-4006-8124-6929b8945dbf" (UID: "60c938e8-f8e0-4006-8124-6929b8945dbf"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.906223 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-config" (OuterVolumeSpecName: "config") pod "60c938e8-f8e0-4006-8124-6929b8945dbf" (UID: "60c938e8-f8e0-4006-8124-6929b8945dbf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.920929 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.920954 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.920968 4974 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.920977 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2j74\" (UniqueName: \"kubernetes.io/projected/60c938e8-f8e0-4006-8124-6929b8945dbf-kube-api-access-t2j74\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.921610 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "60c938e8-f8e0-4006-8124-6929b8945dbf" (UID: "60c938e8-f8e0-4006-8124-6929b8945dbf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:22 crc kubenswrapper[4974]: I1013 18:32:22.924167 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "60c938e8-f8e0-4006-8124-6929b8945dbf" (UID: "60c938e8-f8e0-4006-8124-6929b8945dbf"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.022548 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.022753 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/60c938e8-f8e0-4006-8124-6929b8945dbf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.241090 4974 generic.go:334] "Generic (PLEG): container finished" podID="60c938e8-f8e0-4006-8124-6929b8945dbf" containerID="f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694" exitCode=0 Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.241142 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" event={"ID":"60c938e8-f8e0-4006-8124-6929b8945dbf","Type":"ContainerDied","Data":"f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694"} Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.241169 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" event={"ID":"60c938e8-f8e0-4006-8124-6929b8945dbf","Type":"ContainerDied","Data":"23cf2bcc56f8eed0a0309a87aa287f4781dc9840b33d8e02a4b92d5ac319f8c2"} Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.241185 4974 scope.go:117] "RemoveContainer" containerID="f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.241292 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c686c5b55-x94zv" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.250368 4974 generic.go:334] "Generic (PLEG): container finished" podID="687bd2ca-d010-478f-902c-76dfb378ec55" containerID="4766525e68745961e7c2bbd9b68c665918efcdd1f0a79162423edca436494429" exitCode=0 Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.250399 4974 generic.go:334] "Generic (PLEG): container finished" podID="687bd2ca-d010-478f-902c-76dfb378ec55" containerID="64ee487e15500b3439edd2816fbc3d5e29ff09a6af814ce45e9ceb0108ea31b2" exitCode=0 Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.250463 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"687bd2ca-d010-478f-902c-76dfb378ec55","Type":"ContainerDied","Data":"4766525e68745961e7c2bbd9b68c665918efcdd1f0a79162423edca436494429"} Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.250544 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"687bd2ca-d010-478f-902c-76dfb378ec55","Type":"ContainerDied","Data":"64ee487e15500b3439edd2816fbc3d5e29ff09a6af814ce45e9ceb0108ea31b2"} Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.260617 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef","Type":"ContainerStarted","Data":"7cfa5f9549b149d2372038f61a90b9cb84b54f65fbc44fc06b9a3e6d8068dabc"} Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.278854 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c686c5b55-x94zv"] Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.280888 4974 scope.go:117] "RemoveContainer" containerID="b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.288747 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-6c686c5b55-x94zv"] Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.316144 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.31612196 podStartE2EDuration="3.31612196s" podCreationTimestamp="2025-10-13 18:32:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:23.291098165 +0000 UTC m=+1078.195464245" watchObservedRunningTime="2025-10-13 18:32:23.31612196 +0000 UTC m=+1078.220488040" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.329101 4974 scope.go:117] "RemoveContainer" containerID="f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694" Oct 13 18:32:23 crc kubenswrapper[4974]: E1013 18:32:23.330890 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694\": container with ID starting with f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694 not found: ID does not exist" containerID="f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.330924 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694"} err="failed to get container status \"f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694\": rpc error: code = NotFound desc = could not find container \"f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694\": container with ID starting with f0948b2d7d59aa0b5aa9102b7de146a0e94930b30867d34f8ed89a9586e72694 not found: ID does not exist" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.330948 4974 scope.go:117] "RemoveContainer" 
containerID="b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec" Oct 13 18:32:23 crc kubenswrapper[4974]: E1013 18:32:23.337766 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec\": container with ID starting with b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec not found: ID does not exist" containerID="b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.337802 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec"} err="failed to get container status \"b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec\": rpc error: code = NotFound desc = could not find container \"b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec\": container with ID starting with b7a68067e9d66fe4cbe9189cfc06138f4c33495c46074ad6cb9688d6e26788ec not found: ID does not exist" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.532876 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.633516 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mvwl\" (UniqueName: \"kubernetes.io/projected/687bd2ca-d010-478f-902c-76dfb378ec55-kube-api-access-4mvwl\") pod \"687bd2ca-d010-478f-902c-76dfb378ec55\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.634495 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data-custom\") pod \"687bd2ca-d010-478f-902c-76dfb378ec55\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.634628 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-combined-ca-bundle\") pod \"687bd2ca-d010-478f-902c-76dfb378ec55\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.634684 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/687bd2ca-d010-478f-902c-76dfb378ec55-etc-machine-id\") pod \"687bd2ca-d010-478f-902c-76dfb378ec55\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.634733 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-scripts\") pod \"687bd2ca-d010-478f-902c-76dfb378ec55\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.634758 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data\") pod \"687bd2ca-d010-478f-902c-76dfb378ec55\" (UID: \"687bd2ca-d010-478f-902c-76dfb378ec55\") " Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.634799 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/687bd2ca-d010-478f-902c-76dfb378ec55-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "687bd2ca-d010-478f-902c-76dfb378ec55" (UID: "687bd2ca-d010-478f-902c-76dfb378ec55"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.635088 4974 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/687bd2ca-d010-478f-902c-76dfb378ec55-etc-machine-id\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.643815 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/687bd2ca-d010-478f-902c-76dfb378ec55-kube-api-access-4mvwl" (OuterVolumeSpecName: "kube-api-access-4mvwl") pod "687bd2ca-d010-478f-902c-76dfb378ec55" (UID: "687bd2ca-d010-478f-902c-76dfb378ec55"). InnerVolumeSpecName "kube-api-access-4mvwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.644801 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-scripts" (OuterVolumeSpecName: "scripts") pod "687bd2ca-d010-478f-902c-76dfb378ec55" (UID: "687bd2ca-d010-478f-902c-76dfb378ec55"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.644948 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "687bd2ca-d010-478f-902c-76dfb378ec55" (UID: "687bd2ca-d010-478f-902c-76dfb378ec55"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.701710 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "687bd2ca-d010-478f-902c-76dfb378ec55" (UID: "687bd2ca-d010-478f-902c-76dfb378ec55"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.736958 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mvwl\" (UniqueName: \"kubernetes.io/projected/687bd2ca-d010-478f-902c-76dfb378ec55-kube-api-access-4mvwl\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.736996 4974 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data-custom\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.737006 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.737015 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-scripts\") on 
node \"crc\" DevicePath \"\"" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.739973 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data" (OuterVolumeSpecName: "config-data") pod "687bd2ca-d010-478f-902c-76dfb378ec55" (UID: "687bd2ca-d010-478f-902c-76dfb378ec55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.823270 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60c938e8-f8e0-4006-8124-6929b8945dbf" path="/var/lib/kubelet/pods/60c938e8-f8e0-4006-8124-6929b8945dbf/volumes" Oct 13 18:32:23 crc kubenswrapper[4974]: I1013 18:32:23.839798 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/687bd2ca-d010-478f-902c-76dfb378ec55-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.273032 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"687bd2ca-d010-478f-902c-76dfb378ec55","Type":"ContainerDied","Data":"617f5958a0ac4e47139e383188014a02b215c8c17f45b8c01441f17ed8fb3991"} Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.273079 4974 scope.go:117] "RemoveContainer" containerID="4766525e68745961e7c2bbd9b68c665918efcdd1f0a79162423edca436494429" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.273168 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.305895 4974 scope.go:117] "RemoveContainer" containerID="64ee487e15500b3439edd2816fbc3d5e29ff09a6af814ce45e9ceb0108ea31b2" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.307754 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.315467 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.351371 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 18:32:24 crc kubenswrapper[4974]: E1013 18:32:24.351740 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c938e8-f8e0-4006-8124-6929b8945dbf" containerName="init" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.351755 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c938e8-f8e0-4006-8124-6929b8945dbf" containerName="init" Oct 13 18:32:24 crc kubenswrapper[4974]: E1013 18:32:24.351772 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="687bd2ca-d010-478f-902c-76dfb378ec55" containerName="cinder-scheduler" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.351779 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="687bd2ca-d010-478f-902c-76dfb378ec55" containerName="cinder-scheduler" Oct 13 18:32:24 crc kubenswrapper[4974]: E1013 18:32:24.351807 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60c938e8-f8e0-4006-8124-6929b8945dbf" containerName="dnsmasq-dns" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.351812 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="60c938e8-f8e0-4006-8124-6929b8945dbf" containerName="dnsmasq-dns" Oct 13 18:32:24 crc kubenswrapper[4974]: E1013 18:32:24.351824 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="687bd2ca-d010-478f-902c-76dfb378ec55" containerName="probe" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.351829 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="687bd2ca-d010-478f-902c-76dfb378ec55" containerName="probe" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.352012 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="687bd2ca-d010-478f-902c-76dfb378ec55" containerName="probe" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.352029 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="687bd2ca-d010-478f-902c-76dfb378ec55" containerName="cinder-scheduler" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.352044 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="60c938e8-f8e0-4006-8124-6929b8945dbf" containerName="dnsmasq-dns" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.352973 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.366676 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.411002 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.525792 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.551079 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-scripts\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.551171 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.551328 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.551504 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-config-data\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.551550 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.551790 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7thr\" (UniqueName: \"kubernetes.io/projected/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-kube-api-access-q7thr\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.571206 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/placement-6b66599996-gvfwf" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.654187 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-config-data\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.654235 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.654336 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7thr\" (UniqueName: \"kubernetes.io/projected/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-kube-api-access-q7thr\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.654386 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-scripts\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.654431 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.654452 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.658451 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.661762 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-config-data\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.668061 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.670991 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.683136 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.684535 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7thr\" (UniqueName: \"kubernetes.io/projected/8a1a4b28-15ae-4fa5-8741-2d34b1062eee-kube-api-access-q7thr\") pod \"cinder-scheduler-0\" (UID: \"8a1a4b28-15ae-4fa5-8741-2d34b1062eee\") " pod="openstack/cinder-scheduler-0" Oct 13 18:32:24 crc kubenswrapper[4974]: I1013 18:32:24.967602 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Oct 13 18:32:25 crc kubenswrapper[4974]: I1013 18:32:25.431282 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Oct 13 18:32:25 crc kubenswrapper[4974]: W1013 18:32:25.441766 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a1a4b28_15ae_4fa5_8741_2d34b1062eee.slice/crio-698a1e808735cee0eb0497ecf39aefcc3e5d4c1c7688fdb05c6cb25188ae75c7 WatchSource:0}: Error finding container 698a1e808735cee0eb0497ecf39aefcc3e5d4c1c7688fdb05c6cb25188ae75c7: Status 404 returned error can't find the container with id 698a1e808735cee0eb0497ecf39aefcc3e5d4c1c7688fdb05c6cb25188ae75c7 Oct 13 18:32:25 crc kubenswrapper[4974]: I1013 18:32:25.537556 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-6f98cf4cc8-pgzsc" Oct 13 18:32:25 crc kubenswrapper[4974]: I1013 18:32:25.833046 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="687bd2ca-d010-478f-902c-76dfb378ec55" path="/var/lib/kubelet/pods/687bd2ca-d010-478f-902c-76dfb378ec55/volumes" Oct 13 18:32:26 crc kubenswrapper[4974]: I1013 18:32:26.298806 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"8a1a4b28-15ae-4fa5-8741-2d34b1062eee","Type":"ContainerStarted","Data":"116989c6dfa22e5b22f5cdbd9bee8e3bcde40940e5f36d8467d950e3b9c65bde"} Oct 13 18:32:26 crc kubenswrapper[4974]: I1013 18:32:26.299125 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a1a4b28-15ae-4fa5-8741-2d34b1062eee","Type":"ContainerStarted","Data":"698a1e808735cee0eb0497ecf39aefcc3e5d4c1c7688fdb05c6cb25188ae75c7"} Oct 13 18:32:27 crc kubenswrapper[4974]: I1013 18:32:27.314378 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"8a1a4b28-15ae-4fa5-8741-2d34b1062eee","Type":"ContainerStarted","Data":"dfc1604372ac7770775f835492fc1e344fe9fd560b0a7ebe91045c77d04a057f"} Oct 13 18:32:27 crc kubenswrapper[4974]: I1013 18:32:27.343190 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.343166745 podStartE2EDuration="3.343166745s" podCreationTimestamp="2025-10-13 18:32:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:27.334190851 +0000 UTC m=+1082.238556971" watchObservedRunningTime="2025-10-13 18:32:27.343166745 +0000 UTC m=+1082.247532855" Oct 13 18:32:28 crc kubenswrapper[4974]: I1013 18:32:28.417334 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:32:28 crc kubenswrapper[4974]: I1013 18:32:28.417398 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 13 18:32:28 crc kubenswrapper[4974]: I1013 18:32:28.418207 4974 scope.go:117] "RemoveContainer" containerID="e4ce9d5b57598d4e4ce005dd5f090268be817265c2f0db6be413ed342cbf8e56" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.159832 4974 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/swift-proxy-5b5b857db9-xmtpd"] Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.162135 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.164982 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.165033 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.173495 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.182235 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b5b857db9-xmtpd"] Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.342985 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-public-tls-certs\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.343042 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-config-data\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.343067 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-etc-swift\") pod 
\"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.343087 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-log-httpd\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.343121 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4drn\" (UniqueName: \"kubernetes.io/projected/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-kube-api-access-z4drn\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.343180 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-combined-ca-bundle\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.343207 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-internal-tls-certs\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.343224 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-run-httpd\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.350486 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"89790087-1d9c-4278-b62f-e18a94775048","Type":"ContainerStarted","Data":"98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d"} Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.445240 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-combined-ca-bundle\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.445323 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-internal-tls-certs\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.445346 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-run-httpd\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.445438 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-public-tls-certs\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: 
\"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.445458 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-config-data\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.445479 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-etc-swift\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.445497 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-log-httpd\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.445581 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4drn\" (UniqueName: \"kubernetes.io/projected/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-kube-api-access-z4drn\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.445888 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-run-httpd\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " 
pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.446847 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-log-httpd\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.452241 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-internal-tls-certs\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.453467 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-combined-ca-bundle\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.455678 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-public-tls-certs\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.457548 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-config-data\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 
18:32:29.460470 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-etc-swift\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.469565 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4drn\" (UniqueName: \"kubernetes.io/projected/835d1e2d-e4b2-47d5-89a2-ef955e650cc1-kube-api-access-z4drn\") pod \"swift-proxy-5b5b857db9-xmtpd\" (UID: \"835d1e2d-e4b2-47d5-89a2-ef955e650cc1\") " pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.482608 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.820384 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.820884 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.849123 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.864841 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.968289 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.987145 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Oct 13 18:32:29 crc 
kubenswrapper[4974]: I1013 18:32:29.988293 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Oct 13 18:32:29 crc kubenswrapper[4974]: I1013 18:32:29.989686 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:29.996504 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:29.998116 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:29.998443 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-nbxdb" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.117507 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b5b857db9-xmtpd"] Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.167829 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e672e2-a15c-4cfa-b751-c6208182f2c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.167885 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e9e672e2-a15c-4cfa-b751-c6208182f2c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.167951 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t74bf\" (UniqueName: 
\"kubernetes.io/projected/e9e672e2-a15c-4cfa-b751-c6208182f2c7-kube-api-access-t74bf\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.168114 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e9e672e2-a15c-4cfa-b751-c6208182f2c7-openstack-config\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.269696 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e672e2-a15c-4cfa-b751-c6208182f2c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.269739 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e9e672e2-a15c-4cfa-b751-c6208182f2c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.269794 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t74bf\" (UniqueName: \"kubernetes.io/projected/e9e672e2-a15c-4cfa-b751-c6208182f2c7-kube-api-access-t74bf\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.269836 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e9e672e2-a15c-4cfa-b751-c6208182f2c7-openstack-config\") pod \"openstackclient\" (UID: 
\"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.270727 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e9e672e2-a15c-4cfa-b751-c6208182f2c7-openstack-config\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.275168 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e9e672e2-a15c-4cfa-b751-c6208182f2c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.275807 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9e672e2-a15c-4cfa-b751-c6208182f2c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.289404 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t74bf\" (UniqueName: \"kubernetes.io/projected/e9e672e2-a15c-4cfa-b751-c6208182f2c7-kube-api-access-t74bf\") pod \"openstackclient\" (UID: \"e9e672e2-a15c-4cfa-b751-c6208182f2c7\") " pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.347705 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.363952 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b5b857db9-xmtpd" event={"ID":"835d1e2d-e4b2-47d5-89a2-ef955e650cc1","Type":"ContainerStarted","Data":"94ddaab392ff66dfb9a69a596f3e0cf0060e23a26869117526e253363862e95f"} Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.364022 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b5b857db9-xmtpd" event={"ID":"835d1e2d-e4b2-47d5-89a2-ef955e650cc1","Type":"ContainerStarted","Data":"248683c3c083eda3d57750d50c8a21c346d0118120821f159b2be3a9175fcbd0"} Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.364271 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.364292 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.640027 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.640319 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.692481 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.725401 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:30 crc kubenswrapper[4974]: I1013 18:32:30.879602 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.124571 4974 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.389317 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e9e672e2-a15c-4cfa-b751-c6208182f2c7","Type":"ContainerStarted","Data":"88e7e6f8f302e1b9737fd4e8efd098ad5a216322a3b088464f3882120d9efc36"} Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.391953 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b5b857db9-xmtpd" event={"ID":"835d1e2d-e4b2-47d5-89a2-ef955e650cc1","Type":"ContainerStarted","Data":"9ab9f605d2dc39a56369714704a678ce97cc6bdee2c509d1b700a9253b5c695f"} Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.392431 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.392452 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.416641 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5b5b857db9-xmtpd" podStartSLOduration=2.416623098 podStartE2EDuration="2.416623098s" podCreationTimestamp="2025-10-13 18:32:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:31.416039201 +0000 UTC m=+1086.320405281" watchObservedRunningTime="2025-10-13 18:32:31.416623098 +0000 UTC m=+1086.320989178" Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.542137 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.546962 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="proxy-httpd" containerID="cri-o://ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91" gracePeriod=30 Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.547079 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="sg-core" containerID="cri-o://c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e" gracePeriod=30 Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.542509 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="ceilometer-central-agent" containerID="cri-o://4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59" gracePeriod=30 Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.547125 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="ceilometer-notification-agent" containerID="cri-o://fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611" gracePeriod=30 Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.563431 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.185:3000/\": EOF" Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.674424 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5796767b68-9dktc" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Oct 13 18:32:31 crc kubenswrapper[4974]: I1013 18:32:31.674820 4974 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:32:32 crc kubenswrapper[4974]: I1013 18:32:32.405021 4974 generic.go:334] "Generic (PLEG): container finished" podID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerID="ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91" exitCode=0 Oct 13 18:32:32 crc kubenswrapper[4974]: I1013 18:32:32.405058 4974 generic.go:334] "Generic (PLEG): container finished" podID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerID="c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e" exitCode=2 Oct 13 18:32:32 crc kubenswrapper[4974]: I1013 18:32:32.405067 4974 generic.go:334] "Generic (PLEG): container finished" podID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerID="4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59" exitCode=0 Oct 13 18:32:32 crc kubenswrapper[4974]: I1013 18:32:32.405568 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e27cf425-7780-4d59-b887-ec5da0a83a71","Type":"ContainerDied","Data":"ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91"} Oct 13 18:32:32 crc kubenswrapper[4974]: I1013 18:32:32.405624 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e27cf425-7780-4d59-b887-ec5da0a83a71","Type":"ContainerDied","Data":"c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e"} Oct 13 18:32:32 crc kubenswrapper[4974]: I1013 18:32:32.405637 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e27cf425-7780-4d59-b887-ec5da0a83a71","Type":"ContainerDied","Data":"4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59"} Oct 13 18:32:32 crc kubenswrapper[4974]: I1013 18:32:32.406084 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:32 crc kubenswrapper[4974]: I1013 
18:32:32.406186 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:32 crc kubenswrapper[4974]: I1013 18:32:32.988115 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.032944 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-run-httpd\") pod \"e27cf425-7780-4d59-b887-ec5da0a83a71\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.032996 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-combined-ca-bundle\") pod \"e27cf425-7780-4d59-b887-ec5da0a83a71\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.033125 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-sg-core-conf-yaml\") pod \"e27cf425-7780-4d59-b887-ec5da0a83a71\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.033231 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9pnt\" (UniqueName: \"kubernetes.io/projected/e27cf425-7780-4d59-b887-ec5da0a83a71-kube-api-access-p9pnt\") pod \"e27cf425-7780-4d59-b887-ec5da0a83a71\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.033257 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-log-httpd\") pod 
\"e27cf425-7780-4d59-b887-ec5da0a83a71\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.033274 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e27cf425-7780-4d59-b887-ec5da0a83a71" (UID: "e27cf425-7780-4d59-b887-ec5da0a83a71"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.033306 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-config-data\") pod \"e27cf425-7780-4d59-b887-ec5da0a83a71\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.033324 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-scripts\") pod \"e27cf425-7780-4d59-b887-ec5da0a83a71\" (UID: \"e27cf425-7780-4d59-b887-ec5da0a83a71\") " Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.033531 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e27cf425-7780-4d59-b887-ec5da0a83a71" (UID: "e27cf425-7780-4d59-b887-ec5da0a83a71"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.033895 4974 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.033913 4974 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e27cf425-7780-4d59-b887-ec5da0a83a71-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.038859 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e27cf425-7780-4d59-b887-ec5da0a83a71-kube-api-access-p9pnt" (OuterVolumeSpecName: "kube-api-access-p9pnt") pod "e27cf425-7780-4d59-b887-ec5da0a83a71" (UID: "e27cf425-7780-4d59-b887-ec5da0a83a71"). InnerVolumeSpecName "kube-api-access-p9pnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.066835 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-scripts" (OuterVolumeSpecName: "scripts") pod "e27cf425-7780-4d59-b887-ec5da0a83a71" (UID: "e27cf425-7780-4d59-b887-ec5da0a83a71"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.072803 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e27cf425-7780-4d59-b887-ec5da0a83a71" (UID: "e27cf425-7780-4d59-b887-ec5da0a83a71"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.136092 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9pnt\" (UniqueName: \"kubernetes.io/projected/e27cf425-7780-4d59-b887-ec5da0a83a71-kube-api-access-p9pnt\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.136123 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.136132 4974 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.142791 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e27cf425-7780-4d59-b887-ec5da0a83a71" (UID: "e27cf425-7780-4d59-b887-ec5da0a83a71"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.209972 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-config-data" (OuterVolumeSpecName: "config-data") pod "e27cf425-7780-4d59-b887-ec5da0a83a71" (UID: "e27cf425-7780-4d59-b887-ec5da0a83a71"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.238235 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.238268 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e27cf425-7780-4d59-b887-ec5da0a83a71-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.418090 4974 generic.go:334] "Generic (PLEG): container finished" podID="89790087-1d9c-4278-b62f-e18a94775048" containerID="98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d" exitCode=1 Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.418155 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"89790087-1d9c-4278-b62f-e18a94775048","Type":"ContainerDied","Data":"98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d"} Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.418189 4974 scope.go:117] "RemoveContainer" containerID="e4ce9d5b57598d4e4ce005dd5f090268be817265c2f0db6be413ed342cbf8e56" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.418805 4974 scope.go:117] "RemoveContainer" containerID="98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d" Oct 13 18:32:33 crc kubenswrapper[4974]: E1013 18:32:33.419117 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(89790087-1d9c-4278-b62f-e18a94775048)\"" pod="openstack/watcher-decision-engine-0" podUID="89790087-1d9c-4278-b62f-e18a94775048" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 
18:32:33.423277 4974 generic.go:334] "Generic (PLEG): container finished" podID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerID="fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611" exitCode=0 Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.424380 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.425842 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e27cf425-7780-4d59-b887-ec5da0a83a71","Type":"ContainerDied","Data":"fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611"} Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.425896 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e27cf425-7780-4d59-b887-ec5da0a83a71","Type":"ContainerDied","Data":"df6851fa5f3e7c72b166f22308f279cef82b2096c77045bd2867b3e13b62608f"} Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.480761 4974 scope.go:117] "RemoveContainer" containerID="ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.510736 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.526316 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.529813 4974 scope.go:117] "RemoveContainer" containerID="c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.537991 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:33 crc kubenswrapper[4974]: E1013 18:32:33.538426 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="ceilometer-notification-agent" Oct 13 
18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.538442 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="ceilometer-notification-agent" Oct 13 18:32:33 crc kubenswrapper[4974]: E1013 18:32:33.538473 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="sg-core" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.538480 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="sg-core" Oct 13 18:32:33 crc kubenswrapper[4974]: E1013 18:32:33.538503 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="proxy-httpd" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.538509 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="proxy-httpd" Oct 13 18:32:33 crc kubenswrapper[4974]: E1013 18:32:33.538523 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="ceilometer-central-agent" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.538530 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="ceilometer-central-agent" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.538797 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="ceilometer-central-agent" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.538820 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="sg-core" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.538844 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" 
containerName="ceilometer-notification-agent" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.538855 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" containerName="proxy-httpd" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.540830 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.544389 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.545090 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.549190 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.588972 4974 scope.go:117] "RemoveContainer" containerID="fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.634929 4974 scope.go:117] "RemoveContainer" containerID="4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.635536 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.635842 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.659982 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-log-httpd\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 
18:32:33.660063 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.660083 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.660132 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-run-httpd\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.660193 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfsqw\" (UniqueName: \"kubernetes.io/projected/de8b64be-7ed7-498e-8c8c-696641cefac0-kube-api-access-xfsqw\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.660221 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-config-data\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.660249 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-scripts\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.665570 4974 scope.go:117] "RemoveContainer" containerID="ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91" Oct 13 18:32:33 crc kubenswrapper[4974]: E1013 18:32:33.666306 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91\": container with ID starting with ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91 not found: ID does not exist" containerID="ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.666339 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91"} err="failed to get container status \"ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91\": rpc error: code = NotFound desc = could not find container \"ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91\": container with ID starting with ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91 not found: ID does not exist" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.666361 4974 scope.go:117] "RemoveContainer" containerID="c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e" Oct 13 18:32:33 crc kubenswrapper[4974]: E1013 18:32:33.666964 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e\": container with ID starting with c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e not found: ID does not exist" 
containerID="c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.666987 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e"} err="failed to get container status \"c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e\": rpc error: code = NotFound desc = could not find container \"c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e\": container with ID starting with c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e not found: ID does not exist" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.667000 4974 scope.go:117] "RemoveContainer" containerID="fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611" Oct 13 18:32:33 crc kubenswrapper[4974]: E1013 18:32:33.668730 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611\": container with ID starting with fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611 not found: ID does not exist" containerID="fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.668752 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611"} err="failed to get container status \"fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611\": rpc error: code = NotFound desc = could not find container \"fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611\": container with ID starting with fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611 not found: ID does not exist" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.668768 4974 scope.go:117] 
"RemoveContainer" containerID="4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59" Oct 13 18:32:33 crc kubenswrapper[4974]: E1013 18:32:33.672800 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59\": container with ID starting with 4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59 not found: ID does not exist" containerID="4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.672856 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59"} err="failed to get container status \"4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59\": rpc error: code = NotFound desc = could not find container \"4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59\": container with ID starting with 4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59 not found: ID does not exist" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.701442 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.761720 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-run-httpd\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.761813 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfsqw\" (UniqueName: \"kubernetes.io/projected/de8b64be-7ed7-498e-8c8c-696641cefac0-kube-api-access-xfsqw\") pod \"ceilometer-0\" (UID: 
\"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.761833 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-config-data\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.761868 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-scripts\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.761942 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-log-httpd\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.761976 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.761993 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.763103 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-run-httpd\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.764183 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-log-httpd\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.768296 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.768452 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-config-data\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.775140 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.776197 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-scripts\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.785946 4974 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-xfsqw\" (UniqueName: \"kubernetes.io/projected/de8b64be-7ed7-498e-8c8c-696641cefac0-kube-api-access-xfsqw\") pod \"ceilometer-0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " pod="openstack/ceilometer-0" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.827060 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e27cf425-7780-4d59-b887-ec5da0a83a71" path="/var/lib/kubelet/pods/e27cf425-7780-4d59-b887-ec5da0a83a71/volumes" Oct 13 18:32:33 crc kubenswrapper[4974]: I1013 18:32:33.863170 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:34 crc kubenswrapper[4974]: I1013 18:32:34.173860 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:34 crc kubenswrapper[4974]: I1013 18:32:34.174246 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:32:34 crc kubenswrapper[4974]: I1013 18:32:34.414813 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:34 crc kubenswrapper[4974]: I1013 18:32:34.443037 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:35 crc kubenswrapper[4974]: I1013 18:32:35.182974 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Oct 13 18:32:35 crc kubenswrapper[4974]: I1013 18:32:35.527182 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8b64be-7ed7-498e-8c8c-696641cefac0","Type":"ContainerStarted","Data":"ad814f60b5e97c9cce0218748f54ad2a33efb248c11edb28fe8b60bf2e1a7732"} Oct 13 18:32:35 crc kubenswrapper[4974]: I1013 18:32:35.527598 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"de8b64be-7ed7-498e-8c8c-696641cefac0","Type":"ContainerStarted","Data":"9c34466a674d4714c78f4fc24a803a9dd0610b4bf6052834d361a342058d4785"} Oct 13 18:32:35 crc kubenswrapper[4974]: I1013 18:32:35.527613 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8b64be-7ed7-498e-8c8c-696641cefac0","Type":"ContainerStarted","Data":"3f8006df3b1e0a609f6493d47889bcf61e623a9b5e2adfd781469a1077aa598f"} Oct 13 18:32:36 crc kubenswrapper[4974]: I1013 18:32:36.540372 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8b64be-7ed7-498e-8c8c-696641cefac0","Type":"ContainerStarted","Data":"865f46c723fc4cf66dea891237fcca021a32994a5b824aa1990a0d70523060e1"} Oct 13 18:32:37 crc kubenswrapper[4974]: E1013 18:32:37.347576 4974 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27cf425_7780_4d59_b887_ec5da0a83a71.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9b88447_1cdb_4666_a4c2_31b7a0e7192f.slice/crio-09cc0d62a371439bdad9579ae7fbafbb030801916febc2419d1d87c924749bdc.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27cf425_7780_4d59_b887_ec5da0a83a71.slice/crio-4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27cf425_7780_4d59_b887_ec5da0a83a71.slice/crio-c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27cf425_7780_4d59_b887_ec5da0a83a71.slice/crio-conmon-c3cc0a72d2a8de5b67dd9c7d31eacd232d7c38920b446b7b66025d3588bf496e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89790087_1d9c_4278_b62f_e18a94775048.slice/crio-conmon-98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27cf425_7780_4d59_b887_ec5da0a83a71.slice/crio-conmon-fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89790087_1d9c_4278_b62f_e18a94775048.slice/crio-98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27cf425_7780_4d59_b887_ec5da0a83a71.slice/crio-fa6db954f4772193408b9ba4f66e3c2642daff2eb944d29a2b9bcac58dac1611.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27cf425_7780_4d59_b887_ec5da0a83a71.slice/crio-conmon-4ed0e0393cffdcbbcec1e62a2c0b28349497f05713ec6dc78d58ecf20e973c59.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27cf425_7780_4d59_b887_ec5da0a83a71.slice/crio-df6851fa5f3e7c72b166f22308f279cef82b2096c77045bd2867b3e13b62608f\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode27cf425_7780_4d59_b887_ec5da0a83a71.slice/crio-conmon-ee6c8a025e5edf3955a42052860290741fb5a1b2e5fdcc704b8984658cdd6f91.scope\": RecentStats: unable to find data in memory cache]" Oct 13 18:32:37 crc kubenswrapper[4974]: 
I1013 18:32:37.551541 4974 generic.go:334] "Generic (PLEG): container finished" podID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerID="09cc0d62a371439bdad9579ae7fbafbb030801916febc2419d1d87c924749bdc" exitCode=137 Oct 13 18:32:37 crc kubenswrapper[4974]: I1013 18:32:37.551638 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5796767b68-9dktc" event={"ID":"a9b88447-1cdb-4666-a4c2-31b7a0e7192f","Type":"ContainerDied","Data":"09cc0d62a371439bdad9579ae7fbafbb030801916febc2419d1d87c924749bdc"} Oct 13 18:32:37 crc kubenswrapper[4974]: I1013 18:32:37.555491 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8b64be-7ed7-498e-8c8c-696641cefac0","Type":"ContainerStarted","Data":"3ba800bfba56a3f6de20f2c2bbb8f095cd65f3a2be5bf2fc831729512d2b51c2"} Oct 13 18:32:37 crc kubenswrapper[4974]: I1013 18:32:37.556543 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 18:32:37 crc kubenswrapper[4974]: I1013 18:32:37.590242 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.23718343 podStartE2EDuration="4.59022254s" podCreationTimestamp="2025-10-13 18:32:33 +0000 UTC" firstStartedPulling="2025-10-13 18:32:34.491365929 +0000 UTC m=+1089.395732009" lastFinishedPulling="2025-10-13 18:32:36.844405019 +0000 UTC m=+1091.748771119" observedRunningTime="2025-10-13 18:32:37.582315867 +0000 UTC m=+1092.486681947" watchObservedRunningTime="2025-10-13 18:32:37.59022254 +0000 UTC m=+1092.494588620" Oct 13 18:32:38 crc kubenswrapper[4974]: I1013 18:32:38.417120 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:32:38 crc kubenswrapper[4974]: I1013 18:32:38.417501 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:32:38 crc kubenswrapper[4974]: 
I1013 18:32:38.418265 4974 scope.go:117] "RemoveContainer" containerID="98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d" Oct 13 18:32:38 crc kubenswrapper[4974]: E1013 18:32:38.418568 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(89790087-1d9c-4278-b62f-e18a94775048)\"" pod="openstack/watcher-decision-engine-0" podUID="89790087-1d9c-4278-b62f-e18a94775048" Oct 13 18:32:38 crc kubenswrapper[4974]: I1013 18:32:38.764817 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:39 crc kubenswrapper[4974]: I1013 18:32:39.316643 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:39 crc kubenswrapper[4974]: I1013 18:32:39.316926 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0155adb4-e317-4927-913e-acd03779ad3f" containerName="glance-log" containerID="cri-o://6c85c3841126f94ae7a20b776a5878ca50b1c42baf21f49507a2384c5f5cab44" gracePeriod=30 Oct 13 18:32:39 crc kubenswrapper[4974]: I1013 18:32:39.318411 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="0155adb4-e317-4927-913e-acd03779ad3f" containerName="glance-httpd" containerID="cri-o://b686047b632d9fec8762385ed496d43fcc0f784ad09f4078123fe30ee3fbaa82" gracePeriod=30 Oct 13 18:32:39 crc kubenswrapper[4974]: I1013 18:32:39.496834 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:39 crc kubenswrapper[4974]: I1013 18:32:39.499538 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b5b857db9-xmtpd" Oct 13 18:32:39 crc 
kubenswrapper[4974]: I1013 18:32:39.581241 4974 generic.go:334] "Generic (PLEG): container finished" podID="0155adb4-e317-4927-913e-acd03779ad3f" containerID="6c85c3841126f94ae7a20b776a5878ca50b1c42baf21f49507a2384c5f5cab44" exitCode=143 Oct 13 18:32:39 crc kubenswrapper[4974]: I1013 18:32:39.581606 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0155adb4-e317-4927-913e-acd03779ad3f","Type":"ContainerDied","Data":"6c85c3841126f94ae7a20b776a5878ca50b1c42baf21f49507a2384c5f5cab44"} Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.441712 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.442245 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" containerName="glance-log" containerID="cri-o://8d528756029a83c4ee5b5f12ec17f1b17f8bf8b7eb37ef95586ad4678d508192" gracePeriod=30 Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.442349 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" containerName="glance-httpd" containerID="cri-o://7cfa5f9549b149d2372038f61a90b9cb84b54f65fbc44fc06b9a3e6d8068dabc" gracePeriod=30 Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.594252 4974 generic.go:334] "Generic (PLEG): container finished" podID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" containerID="8d528756029a83c4ee5b5f12ec17f1b17f8bf8b7eb37ef95586ad4678d508192" exitCode=143 Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.594489 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef","Type":"ContainerDied","Data":"8d528756029a83c4ee5b5f12ec17f1b17f8bf8b7eb37ef95586ad4678d508192"} Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.596475 4974 generic.go:334] "Generic (PLEG): container finished" podID="0155adb4-e317-4927-913e-acd03779ad3f" containerID="b686047b632d9fec8762385ed496d43fcc0f784ad09f4078123fe30ee3fbaa82" exitCode=0 Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.596618 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0155adb4-e317-4927-913e-acd03779ad3f","Type":"ContainerDied","Data":"b686047b632d9fec8762385ed496d43fcc0f784ad09f4078123fe30ee3fbaa82"} Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.596712 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="ceilometer-central-agent" containerID="cri-o://9c34466a674d4714c78f4fc24a803a9dd0610b4bf6052834d361a342058d4785" gracePeriod=30 Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.596796 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="proxy-httpd" containerID="cri-o://3ba800bfba56a3f6de20f2c2bbb8f095cd65f3a2be5bf2fc831729512d2b51c2" gracePeriod=30 Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.596843 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="sg-core" containerID="cri-o://865f46c723fc4cf66dea891237fcca021a32994a5b824aa1990a0d70523060e1" gracePeriod=30 Oct 13 18:32:40 crc kubenswrapper[4974]: I1013 18:32:40.596885 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" 
containerName="ceilometer-notification-agent" containerID="cri-o://ad814f60b5e97c9cce0218748f54ad2a33efb248c11edb28fe8b60bf2e1a7732" gracePeriod=30 Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.612742 4974 generic.go:334] "Generic (PLEG): container finished" podID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" containerID="7cfa5f9549b149d2372038f61a90b9cb84b54f65fbc44fc06b9a3e6d8068dabc" exitCode=0 Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.612809 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef","Type":"ContainerDied","Data":"7cfa5f9549b149d2372038f61a90b9cb84b54f65fbc44fc06b9a3e6d8068dabc"} Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.615288 4974 generic.go:334] "Generic (PLEG): container finished" podID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerID="3ba800bfba56a3f6de20f2c2bbb8f095cd65f3a2be5bf2fc831729512d2b51c2" exitCode=0 Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.615311 4974 generic.go:334] "Generic (PLEG): container finished" podID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerID="865f46c723fc4cf66dea891237fcca021a32994a5b824aa1990a0d70523060e1" exitCode=2 Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.615319 4974 generic.go:334] "Generic (PLEG): container finished" podID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerID="ad814f60b5e97c9cce0218748f54ad2a33efb248c11edb28fe8b60bf2e1a7732" exitCode=0 Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.615327 4974 generic.go:334] "Generic (PLEG): container finished" podID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerID="9c34466a674d4714c78f4fc24a803a9dd0610b4bf6052834d361a342058d4785" exitCode=0 Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.615349 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"de8b64be-7ed7-498e-8c8c-696641cefac0","Type":"ContainerDied","Data":"3ba800bfba56a3f6de20f2c2bbb8f095cd65f3a2be5bf2fc831729512d2b51c2"} Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.615377 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8b64be-7ed7-498e-8c8c-696641cefac0","Type":"ContainerDied","Data":"865f46c723fc4cf66dea891237fcca021a32994a5b824aa1990a0d70523060e1"} Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.615393 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8b64be-7ed7-498e-8c8c-696641cefac0","Type":"ContainerDied","Data":"ad814f60b5e97c9cce0218748f54ad2a33efb248c11edb28fe8b60bf2e1a7732"} Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.615404 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8b64be-7ed7-498e-8c8c-696641cefac0","Type":"ContainerDied","Data":"9c34466a674d4714c78f4fc24a803a9dd0610b4bf6052834d361a342058d4785"} Oct 13 18:32:41 crc kubenswrapper[4974]: I1013 18:32:41.675267 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5796767b68-9dktc" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.165:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.165:8443: connect: connection refused" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.594218 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.675748 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5796767b68-9dktc" event={"ID":"a9b88447-1cdb-4666-a4c2-31b7a0e7192f","Type":"ContainerDied","Data":"3a4cbb1020ef81130d0f39ec5b7a88db5ee0782152c9a294fd20fd56beba8a46"} Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.675923 4974 scope.go:117] "RemoveContainer" containerID="c71f94d566debc81c03ff7c3180f22e17d2ee5689e9730d35d2824e22263bd4e" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.675803 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5796767b68-9dktc" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.703078 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-logs\") pod \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.703117 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-secret-key\") pod \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.703151 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-combined-ca-bundle\") pod \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.703210 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-tls-certs\") pod \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.703236 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-scripts\") pod \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.703261 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-config-data\") pod \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.703344 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p7vvd\" (UniqueName: \"kubernetes.io/projected/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-kube-api-access-p7vvd\") pod \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\" (UID: \"a9b88447-1cdb-4666-a4c2-31b7a0e7192f\") " Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.705844 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-logs" (OuterVolumeSpecName: "logs") pod "a9b88447-1cdb-4666-a4c2-31b7a0e7192f" (UID: "a9b88447-1cdb-4666-a4c2-31b7a0e7192f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.709546 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "a9b88447-1cdb-4666-a4c2-31b7a0e7192f" (UID: "a9b88447-1cdb-4666-a4c2-31b7a0e7192f"). 
InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.712946 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-kube-api-access-p7vvd" (OuterVolumeSpecName: "kube-api-access-p7vvd") pod "a9b88447-1cdb-4666-a4c2-31b7a0e7192f" (UID: "a9b88447-1cdb-4666-a4c2-31b7a0e7192f"). InnerVolumeSpecName "kube-api-access-p7vvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.727382 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-scripts" (OuterVolumeSpecName: "scripts") pod "a9b88447-1cdb-4666-a4c2-31b7a0e7192f" (UID: "a9b88447-1cdb-4666-a4c2-31b7a0e7192f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.744625 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9b88447-1cdb-4666-a4c2-31b7a0e7192f" (UID: "a9b88447-1cdb-4666-a4c2-31b7a0e7192f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.749847 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-config-data" (OuterVolumeSpecName: "config-data") pod "a9b88447-1cdb-4666-a4c2-31b7a0e7192f" (UID: "a9b88447-1cdb-4666-a4c2-31b7a0e7192f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.777434 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "a9b88447-1cdb-4666-a4c2-31b7a0e7192f" (UID: "a9b88447-1cdb-4666-a4c2-31b7a0e7192f"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.806163 4974 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.806192 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.806202 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.806211 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p7vvd\" (UniqueName: \"kubernetes.io/projected/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-kube-api-access-p7vvd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.806222 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.806230 4974 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.806240 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9b88447-1cdb-4666-a4c2-31b7a0e7192f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.918166 4974 scope.go:117] "RemoveContainer" containerID="09cc0d62a371439bdad9579ae7fbafbb030801916febc2419d1d87c924749bdc" Oct 13 18:32:43 crc kubenswrapper[4974]: I1013 18:32:43.979563 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.012074 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-logs\") pod \"0155adb4-e317-4927-913e-acd03779ad3f\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.012127 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-httpd-run\") pod \"0155adb4-e317-4927-913e-acd03779ad3f\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.012253 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-combined-ca-bundle\") pod \"0155adb4-e317-4927-913e-acd03779ad3f\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.012345 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7d88\" (UniqueName: 
\"kubernetes.io/projected/0155adb4-e317-4927-913e-acd03779ad3f-kube-api-access-r7d88\") pod \"0155adb4-e317-4927-913e-acd03779ad3f\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.012381 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"0155adb4-e317-4927-913e-acd03779ad3f\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.012398 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-public-tls-certs\") pod \"0155adb4-e317-4927-913e-acd03779ad3f\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.012421 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-config-data\") pod \"0155adb4-e317-4927-913e-acd03779ad3f\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.012443 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-scripts\") pod \"0155adb4-e317-4927-913e-acd03779ad3f\" (UID: \"0155adb4-e317-4927-913e-acd03779ad3f\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.013904 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "0155adb4-e317-4927-913e-acd03779ad3f" (UID: "0155adb4-e317-4927-913e-acd03779ad3f"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.014129 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-logs" (OuterVolumeSpecName: "logs") pod "0155adb4-e317-4927-913e-acd03779ad3f" (UID: "0155adb4-e317-4927-913e-acd03779ad3f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.020812 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-scripts" (OuterVolumeSpecName: "scripts") pod "0155adb4-e317-4927-913e-acd03779ad3f" (UID: "0155adb4-e317-4927-913e-acd03779ad3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.023774 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0155adb4-e317-4927-913e-acd03779ad3f-kube-api-access-r7d88" (OuterVolumeSpecName: "kube-api-access-r7d88") pod "0155adb4-e317-4927-913e-acd03779ad3f" (UID: "0155adb4-e317-4927-913e-acd03779ad3f"). InnerVolumeSpecName "kube-api-access-r7d88". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.038201 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "0155adb4-e317-4927-913e-acd03779ad3f" (UID: "0155adb4-e317-4927-913e-acd03779ad3f"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.061515 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5796767b68-9dktc"] Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.064215 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5796767b68-9dktc"] Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.114050 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7d88\" (UniqueName: \"kubernetes.io/projected/0155adb4-e317-4927-913e-acd03779ad3f-kube-api-access-r7d88\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.114092 4974 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.114104 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.114114 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.114122 4974 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0155adb4-e317-4927-913e-acd03779ad3f-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.135489 4974 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.137786 4974 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0155adb4-e317-4927-913e-acd03779ad3f" (UID: "0155adb4-e317-4927-913e-acd03779ad3f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.138850 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.149702 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.153763 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-config-data" (OuterVolumeSpecName: "config-data") pod "0155adb4-e317-4927-913e-acd03779ad3f" (UID: "0155adb4-e317-4927-913e-acd03779ad3f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.194734 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "0155adb4-e317-4927-913e-acd03779ad3f" (UID: "0155adb4-e317-4927-913e-acd03779ad3f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.216706 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-log-httpd\") pod \"de8b64be-7ed7-498e-8c8c-696641cefac0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.216754 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xjb6h\" (UniqueName: \"kubernetes.io/projected/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-kube-api-access-xjb6h\") pod \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.216780 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-scripts\") pod \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.216802 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-internal-tls-certs\") pod \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.216834 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-sg-core-conf-yaml\") pod \"de8b64be-7ed7-498e-8c8c-696641cefac0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.216867 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-logs\") pod \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.216891 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-run-httpd\") pod \"de8b64be-7ed7-498e-8c8c-696641cefac0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.216924 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-scripts\") pod \"de8b64be-7ed7-498e-8c8c-696641cefac0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.216946 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-combined-ca-bundle\") pod \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.216961 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-combined-ca-bundle\") pod \"de8b64be-7ed7-498e-8c8c-696641cefac0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.217005 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfsqw\" (UniqueName: \"kubernetes.io/projected/de8b64be-7ed7-498e-8c8c-696641cefac0-kube-api-access-xfsqw\") pod \"de8b64be-7ed7-498e-8c8c-696641cefac0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.217036 
4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.217060 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-config-data\") pod \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.217084 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-config-data\") pod \"de8b64be-7ed7-498e-8c8c-696641cefac0\" (UID: \"de8b64be-7ed7-498e-8c8c-696641cefac0\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.217111 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-httpd-run\") pod \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\" (UID: \"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef\") " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.217402 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.217416 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.217426 4974 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.217435 4974 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0155adb4-e317-4927-913e-acd03779ad3f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.217791 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" (UID: "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.218094 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de8b64be-7ed7-498e-8c8c-696641cefac0" (UID: "de8b64be-7ed7-498e-8c8c-696641cefac0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.220945 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-kube-api-access-xjb6h" (OuterVolumeSpecName: "kube-api-access-xjb6h") pod "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" (UID: "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef"). InnerVolumeSpecName "kube-api-access-xjb6h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.225082 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-logs" (OuterVolumeSpecName: "logs") pod "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" (UID: "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.225362 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-scripts" (OuterVolumeSpecName: "scripts") pod "de8b64be-7ed7-498e-8c8c-696641cefac0" (UID: "de8b64be-7ed7-498e-8c8c-696641cefac0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.225679 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de8b64be-7ed7-498e-8c8c-696641cefac0" (UID: "de8b64be-7ed7-498e-8c8c-696641cefac0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.236814 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8b64be-7ed7-498e-8c8c-696641cefac0-kube-api-access-xfsqw" (OuterVolumeSpecName: "kube-api-access-xfsqw") pod "de8b64be-7ed7-498e-8c8c-696641cefac0" (UID: "de8b64be-7ed7-498e-8c8c-696641cefac0"). InnerVolumeSpecName "kube-api-access-xfsqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.257382 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-scripts" (OuterVolumeSpecName: "scripts") pod "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" (UID: "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.267985 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" (UID: "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.315969 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" (UID: "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.318892 4974 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-httpd-run\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.318917 4974 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.318926 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xjb6h\" (UniqueName: \"kubernetes.io/projected/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-kube-api-access-xjb6h\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.318937 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc 
kubenswrapper[4974]: I1013 18:32:44.318946 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.318953 4974 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de8b64be-7ed7-498e-8c8c-696641cefac0-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.318961 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.318969 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.318976 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfsqw\" (UniqueName: \"kubernetes.io/projected/de8b64be-7ed7-498e-8c8c-696641cefac0-kube-api-access-xfsqw\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.319003 4974 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.337480 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de8b64be-7ed7-498e-8c8c-696641cefac0" (UID: "de8b64be-7ed7-498e-8c8c-696641cefac0"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.364672 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-config-data" (OuterVolumeSpecName: "config-data") pod "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" (UID: "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.366749 4974 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.369837 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de8b64be-7ed7-498e-8c8c-696641cefac0" (UID: "de8b64be-7ed7-498e-8c8c-696641cefac0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.371993 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" (UID: "54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.399668 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-config-data" (OuterVolumeSpecName: "config-data") pod "de8b64be-7ed7-498e-8c8c-696641cefac0" (UID: "de8b64be-7ed7-498e-8c8c-696641cefac0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.422097 4974 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.422131 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.422143 4974 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.422154 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.422163 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de8b64be-7ed7-498e-8c8c-696641cefac0-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.422171 4974 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.691454 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef","Type":"ContainerDied","Data":"84cb0d9ee8808f06b9db022d7e0f557179b8429e48ee7593f67deabc2633735a"} Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 
18:32:44.691530 4974 scope.go:117] "RemoveContainer" containerID="7cfa5f9549b149d2372038f61a90b9cb84b54f65fbc44fc06b9a3e6d8068dabc" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.691713 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.700246 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0155adb4-e317-4927-913e-acd03779ad3f","Type":"ContainerDied","Data":"34de5f8990d443d98bb8edc72eb36cf700ebfd21a276a09306c64a3fcffcc5e5"} Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.700358 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.709344 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de8b64be-7ed7-498e-8c8c-696641cefac0","Type":"ContainerDied","Data":"3f8006df3b1e0a609f6493d47889bcf61e623a9b5e2adfd781469a1077aa598f"} Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.709483 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.714396 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"e9e672e2-a15c-4cfa-b751-c6208182f2c7","Type":"ContainerStarted","Data":"8d5eaa5a15467380e3ca8ac68b61b40747d931126712f84b91273b86160f3afc"} Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.728108 4974 scope.go:117] "RemoveContainer" containerID="8d528756029a83c4ee5b5f12ec17f1b17f8bf8b7eb37ef95586ad4678d508192" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.736983 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.760624 4974 scope.go:117] "RemoveContainer" containerID="b686047b632d9fec8762385ed496d43fcc0f784ad09f4078123fe30ee3fbaa82" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.767771 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.783933 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.794544 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.795412 4974 scope.go:117] "RemoveContainer" containerID="6c85c3841126f94ae7a20b776a5878ca50b1c42baf21f49507a2384c5f5cab44" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802380 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:44 crc kubenswrapper[4974]: E1013 18:32:44.802765 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" containerName="glance-httpd" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802776 4974 
state_mem.go:107] "Deleted CPUSet assignment" podUID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" containerName="glance-httpd" Oct 13 18:32:44 crc kubenswrapper[4974]: E1013 18:32:44.802792 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon-log" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802799 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon-log" Oct 13 18:32:44 crc kubenswrapper[4974]: E1013 18:32:44.802819 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0155adb4-e317-4927-913e-acd03779ad3f" containerName="glance-httpd" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802826 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="0155adb4-e317-4927-913e-acd03779ad3f" containerName="glance-httpd" Oct 13 18:32:44 crc kubenswrapper[4974]: E1013 18:32:44.802840 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802846 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon" Oct 13 18:32:44 crc kubenswrapper[4974]: E1013 18:32:44.802862 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" containerName="glance-log" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802868 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" containerName="glance-log" Oct 13 18:32:44 crc kubenswrapper[4974]: E1013 18:32:44.802880 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="sg-core" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802886 4974 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="sg-core" Oct 13 18:32:44 crc kubenswrapper[4974]: E1013 18:32:44.802900 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="ceilometer-central-agent" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802906 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="ceilometer-central-agent" Oct 13 18:32:44 crc kubenswrapper[4974]: E1013 18:32:44.802917 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0155adb4-e317-4927-913e-acd03779ad3f" containerName="glance-log" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802924 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="0155adb4-e317-4927-913e-acd03779ad3f" containerName="glance-log" Oct 13 18:32:44 crc kubenswrapper[4974]: E1013 18:32:44.802932 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="ceilometer-notification-agent" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802938 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="ceilometer-notification-agent" Oct 13 18:32:44 crc kubenswrapper[4974]: E1013 18:32:44.802949 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="proxy-httpd" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.802955 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="proxy-httpd" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.803113 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="ceilometer-central-agent" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.803125 4974 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" containerName="glance-httpd" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.803136 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="proxy-httpd" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.803146 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon-log" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.803164 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="sg-core" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.803172 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" containerName="horizon" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.803184 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="0155adb4-e317-4927-913e-acd03779ad3f" containerName="glance-httpd" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.803195 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="0155adb4-e317-4927-913e-acd03779ad3f" containerName="glance-log" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.803205 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" containerName="glance-log" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.803212 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" containerName="ceilometer-notification-agent" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.804180 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.804279 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.153371672 podStartE2EDuration="15.80426037s" podCreationTimestamp="2025-10-13 18:32:29 +0000 UTC" firstStartedPulling="2025-10-13 18:32:30.880782518 +0000 UTC m=+1085.785148598" lastFinishedPulling="2025-10-13 18:32:43.531671216 +0000 UTC m=+1098.436037296" observedRunningTime="2025-10-13 18:32:44.765473927 +0000 UTC m=+1099.669840017" watchObservedRunningTime="2025-10-13 18:32:44.80426037 +0000 UTC m=+1099.708626450" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.809296 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.809437 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.809682 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-nt8rf" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.809827 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.860701 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.862368 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.874343 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.875091 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.890697 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.892528 4974 scope.go:117] "RemoveContainer" containerID="3ba800bfba56a3f6de20f2c2bbb8f095cd65f3a2be5bf2fc831729512d2b51c2" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.917167 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.933680 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.933993 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.934129 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.934235 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.934364 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w45f\" (UniqueName: \"kubernetes.io/projected/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-kube-api-access-9w45f\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.934463 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.934569 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.934733 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.955715 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:44 crc kubenswrapper[4974]: I1013 18:32:44.984897 4974 scope.go:117] "RemoveContainer" containerID="865f46c723fc4cf66dea891237fcca021a32994a5b824aa1990a0d70523060e1" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.001001 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.008719 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.024316 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.041970 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044133 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044434 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044524 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044564 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c989ea63-17f9-4aca-a407-9e07cbb1a04c-logs\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044616 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-config-data\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044662 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w45f\" (UniqueName: \"kubernetes.io/projected/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-kube-api-access-9w45f\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044691 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044734 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044788 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044891 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c989ea63-17f9-4aca-a407-9e07cbb1a04c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044928 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.044996 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.045056 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxkqx\" (UniqueName: \"kubernetes.io/projected/c989ea63-17f9-4aca-a407-9e07cbb1a04c-kube-api-access-fxkqx\") pod \"glance-default-external-api-0\" 
(UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.045100 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-scripts\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.045170 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.045219 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.045256 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.045303 4974 scope.go:117] "RemoveContainer" containerID="ad814f60b5e97c9cce0218748f54ad2a33efb248c11edb28fe8b60bf2e1a7732" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.046051 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.049834 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.049958 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.050137 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-logs\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.052049 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.052168 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-internal-tls-certs\") pod 
\"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.068265 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.076520 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w45f\" (UniqueName: \"kubernetes.io/projected/7f9b8239-9b8d-4e59-8ba6-b7d8b5959248-kube-api-access-9w45f\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.080824 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.081340 4974 scope.go:117] "RemoveContainer" containerID="9c34466a674d4714c78f4fc24a803a9dd0610b4bf6052834d361a342058d4785" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.093463 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248\") " pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.128337 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.147176 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.147358 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-config-data\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.147456 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.147536 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.147627 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxkqx\" (UniqueName: \"kubernetes.io/projected/c989ea63-17f9-4aca-a407-9e07cbb1a04c-kube-api-access-fxkqx\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 
13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.147749 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-scripts\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.147920 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-log-httpd\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.148087 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.148537 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-run-httpd\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.148679 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c989ea63-17f9-4aca-a407-9e07cbb1a04c-logs\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.148872 4974 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-config-data\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.149264 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztwzw\" (UniqueName: \"kubernetes.io/projected/dd03cb7a-09fc-43e9-8310-8a8841090c43-kube-api-access-ztwzw\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.149349 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-scripts\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.149439 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.149529 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c989ea63-17f9-4aca-a407-9e07cbb1a04c-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.150129 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c989ea63-17f9-4aca-a407-9e07cbb1a04c-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.150970 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.151558 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.151887 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-scripts\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.153798 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.154937 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c989ea63-17f9-4aca-a407-9e07cbb1a04c-config-data\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") 
" pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.159169 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c989ea63-17f9-4aca-a407-9e07cbb1a04c-logs\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.163293 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxkqx\" (UniqueName: \"kubernetes.io/projected/c989ea63-17f9-4aca-a407-9e07cbb1a04c-kube-api-access-fxkqx\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.178153 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-external-api-0\" (UID: \"c989ea63-17f9-4aca-a407-9e07cbb1a04c\") " pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.231554 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.250874 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-run-httpd\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.250971 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztwzw\" (UniqueName: \"kubernetes.io/projected/dd03cb7a-09fc-43e9-8310-8a8841090c43-kube-api-access-ztwzw\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.251012 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-scripts\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.251080 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.251107 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-config-data\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.251135 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.251193 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-log-httpd\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.252021 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-log-httpd\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.252271 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-run-httpd\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.261349 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-scripts\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.267250 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-config-data\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.269166 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.270534 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztwzw\" (UniqueName: \"kubernetes.io/projected/dd03cb7a-09fc-43e9-8310-8a8841090c43-kube-api-access-ztwzw\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.291053 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.363242 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.705004 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.725695 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248","Type":"ContainerStarted","Data":"cab747efbcd027f9f2e0f40a1720cea710a7514f2b3c988a17de2cef6deaad3a"} Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.829038 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0155adb4-e317-4927-913e-acd03779ad3f" path="/var/lib/kubelet/pods/0155adb4-e317-4927-913e-acd03779ad3f/volumes" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.829877 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef" path="/var/lib/kubelet/pods/54e5c5a7-8245-4f6f-a8e0-8987ddb7f6ef/volumes" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.830826 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b88447-1cdb-4666-a4c2-31b7a0e7192f" path="/var/lib/kubelet/pods/a9b88447-1cdb-4666-a4c2-31b7a0e7192f/volumes" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.832057 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8b64be-7ed7-498e-8c8c-696641cefac0" path="/var/lib/kubelet/pods/de8b64be-7ed7-498e-8c8c-696641cefac0/volumes" Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.832804 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Oct 13 18:32:45 crc kubenswrapper[4974]: I1013 18:32:45.896447 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:45 crc kubenswrapper[4974]: W1013 18:32:45.903043 4974 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd03cb7a_09fc_43e9_8310_8a8841090c43.slice/crio-367bf6e97d0da7a1d3da671beeea0229359a025de89ec42618405d9c873ee757 WatchSource:0}: Error finding container 367bf6e97d0da7a1d3da671beeea0229359a025de89ec42618405d9c873ee757: Status 404 returned error can't find the container with id 367bf6e97d0da7a1d3da671beeea0229359a025de89ec42618405d9c873ee757 Oct 13 18:32:46 crc kubenswrapper[4974]: I1013 18:32:46.291075 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:46 crc kubenswrapper[4974]: I1013 18:32:46.748448 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd03cb7a-09fc-43e9-8310-8a8841090c43","Type":"ContainerStarted","Data":"0eb0eb3be527351f205c0c832510cfcd31bdedf215a4a44f7502979e84a86cbb"} Oct 13 18:32:46 crc kubenswrapper[4974]: I1013 18:32:46.748839 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd03cb7a-09fc-43e9-8310-8a8841090c43","Type":"ContainerStarted","Data":"e4e2fd48574bd761c7f36b61f4299a4b4df8e7e00c37264f660dc1584e5899c1"} Oct 13 18:32:46 crc kubenswrapper[4974]: I1013 18:32:46.748862 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd03cb7a-09fc-43e9-8310-8a8841090c43","Type":"ContainerStarted","Data":"367bf6e97d0da7a1d3da671beeea0229359a025de89ec42618405d9c873ee757"} Oct 13 18:32:46 crc kubenswrapper[4974]: I1013 18:32:46.755795 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248","Type":"ContainerStarted","Data":"ae6189e9e0859e4b8bef45020c1b3c76935c1475e2ee10c4c5a955891d5b4697"} Oct 13 18:32:46 crc kubenswrapper[4974]: I1013 18:32:46.760795 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"c989ea63-17f9-4aca-a407-9e07cbb1a04c","Type":"ContainerStarted","Data":"139b918571106a6f20aa6aa24eac14b792667173f74adf37b3fef107d97282e0"} Oct 13 18:32:46 crc kubenswrapper[4974]: I1013 18:32:46.760856 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c989ea63-17f9-4aca-a407-9e07cbb1a04c","Type":"ContainerStarted","Data":"60621af16d297fc61ad40aacf7b04d6b3f58afd81597a901e91de44a1a2f0c1a"} Oct 13 18:32:47 crc kubenswrapper[4974]: I1013 18:32:47.773844 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"c989ea63-17f9-4aca-a407-9e07cbb1a04c","Type":"ContainerStarted","Data":"625b87cc4e4efc9653ba3da38da59f63cc863f311069f6c5b136774346daa012"} Oct 13 18:32:47 crc kubenswrapper[4974]: I1013 18:32:47.777063 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd03cb7a-09fc-43e9-8310-8a8841090c43","Type":"ContainerStarted","Data":"b7f79f6b3db4cc132432565344c910f33e338a4e438f0afb6abf1a50a3bd3ab7"} Oct 13 18:32:47 crc kubenswrapper[4974]: I1013 18:32:47.779373 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7f9b8239-9b8d-4e59-8ba6-b7d8b5959248","Type":"ContainerStarted","Data":"42ce9e1deca2761dac2e6e492aecc245b28fb83101713172220b81065c04146e"} Oct 13 18:32:47 crc kubenswrapper[4974]: I1013 18:32:47.809564 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.809540909 podStartE2EDuration="3.809540909s" podCreationTimestamp="2025-10-13 18:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:47.79216867 +0000 UTC m=+1102.696534750" watchObservedRunningTime="2025-10-13 18:32:47.809540909 +0000 UTC m=+1102.713906989" Oct 13 18:32:47 crc 
kubenswrapper[4974]: I1013 18:32:47.826522 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.826507067 podStartE2EDuration="3.826507067s" podCreationTimestamp="2025-10-13 18:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:32:47.817106213 +0000 UTC m=+1102.721472293" watchObservedRunningTime="2025-10-13 18:32:47.826507067 +0000 UTC m=+1102.730873147" Oct 13 18:32:49 crc kubenswrapper[4974]: I1013 18:32:49.802507 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd03cb7a-09fc-43e9-8310-8a8841090c43","Type":"ContainerStarted","Data":"6e413a58dfacc4cc0cb02d10a0d42b36c2027f5aa17662a6ac662655f76ddc5a"} Oct 13 18:32:49 crc kubenswrapper[4974]: I1013 18:32:49.802989 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="ceilometer-central-agent" containerID="cri-o://e4e2fd48574bd761c7f36b61f4299a4b4df8e7e00c37264f660dc1584e5899c1" gracePeriod=30 Oct 13 18:32:49 crc kubenswrapper[4974]: I1013 18:32:49.803236 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 18:32:49 crc kubenswrapper[4974]: I1013 18:32:49.803334 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="proxy-httpd" containerID="cri-o://6e413a58dfacc4cc0cb02d10a0d42b36c2027f5aa17662a6ac662655f76ddc5a" gracePeriod=30 Oct 13 18:32:49 crc kubenswrapper[4974]: I1013 18:32:49.803455 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="sg-core" 
containerID="cri-o://b7f79f6b3db4cc132432565344c910f33e338a4e438f0afb6abf1a50a3bd3ab7" gracePeriod=30 Oct 13 18:32:49 crc kubenswrapper[4974]: I1013 18:32:49.803519 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="ceilometer-notification-agent" containerID="cri-o://0eb0eb3be527351f205c0c832510cfcd31bdedf215a4a44f7502979e84a86cbb" gracePeriod=30 Oct 13 18:32:49 crc kubenswrapper[4974]: I1013 18:32:49.832770 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.044804082 podStartE2EDuration="5.832753673s" podCreationTimestamp="2025-10-13 18:32:44 +0000 UTC" firstStartedPulling="2025-10-13 18:32:45.905764061 +0000 UTC m=+1100.810130141" lastFinishedPulling="2025-10-13 18:32:48.693713642 +0000 UTC m=+1103.598079732" observedRunningTime="2025-10-13 18:32:49.829005708 +0000 UTC m=+1104.733371798" watchObservedRunningTime="2025-10-13 18:32:49.832753673 +0000 UTC m=+1104.737119763" Oct 13 18:32:49 crc kubenswrapper[4974]: I1013 18:32:49.987518 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-65hg2"] Oct 13 18:32:49 crc kubenswrapper[4974]: I1013 18:32:49.989163 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-65hg2" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.004713 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-65hg2"] Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.044048 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x67n6\" (UniqueName: \"kubernetes.io/projected/2faa46ff-30c6-4bc7-9b04-69088774b56d-kube-api-access-x67n6\") pod \"nova-api-db-create-65hg2\" (UID: \"2faa46ff-30c6-4bc7-9b04-69088774b56d\") " pod="openstack/nova-api-db-create-65hg2" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.080287 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-j6b9d"] Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.081410 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-j6b9d" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.091820 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-j6b9d"] Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.146041 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x67n6\" (UniqueName: \"kubernetes.io/projected/2faa46ff-30c6-4bc7-9b04-69088774b56d-kube-api-access-x67n6\") pod \"nova-api-db-create-65hg2\" (UID: \"2faa46ff-30c6-4bc7-9b04-69088774b56d\") " pod="openstack/nova-api-db-create-65hg2" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.146131 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97vqb\" (UniqueName: \"kubernetes.io/projected/27f6985f-10d9-40de-8441-8289ed83515c-kube-api-access-97vqb\") pod \"nova-cell0-db-create-j6b9d\" (UID: \"27f6985f-10d9-40de-8441-8289ed83515c\") " pod="openstack/nova-cell0-db-create-j6b9d" Oct 13 18:32:50 crc kubenswrapper[4974]: 
I1013 18:32:50.168078 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x67n6\" (UniqueName: \"kubernetes.io/projected/2faa46ff-30c6-4bc7-9b04-69088774b56d-kube-api-access-x67n6\") pod \"nova-api-db-create-65hg2\" (UID: \"2faa46ff-30c6-4bc7-9b04-69088774b56d\") " pod="openstack/nova-api-db-create-65hg2" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.248629 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97vqb\" (UniqueName: \"kubernetes.io/projected/27f6985f-10d9-40de-8441-8289ed83515c-kube-api-access-97vqb\") pod \"nova-cell0-db-create-j6b9d\" (UID: \"27f6985f-10d9-40de-8441-8289ed83515c\") " pod="openstack/nova-cell0-db-create-j6b9d" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.274149 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97vqb\" (UniqueName: \"kubernetes.io/projected/27f6985f-10d9-40de-8441-8289ed83515c-kube-api-access-97vqb\") pod \"nova-cell0-db-create-j6b9d\" (UID: \"27f6985f-10d9-40de-8441-8289ed83515c\") " pod="openstack/nova-cell0-db-create-j6b9d" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.293381 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-wqqtz"] Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.309218 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-65hg2" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.318350 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-wqqtz" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.371928 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wqqtz"] Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.411079 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-j6b9d" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.459269 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48lgs\" (UniqueName: \"kubernetes.io/projected/8a86a831-6077-4935-b8ce-f48755ebe658-kube-api-access-48lgs\") pod \"nova-cell1-db-create-wqqtz\" (UID: \"8a86a831-6077-4935-b8ce-f48755ebe658\") " pod="openstack/nova-cell1-db-create-wqqtz" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.560600 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48lgs\" (UniqueName: \"kubernetes.io/projected/8a86a831-6077-4935-b8ce-f48755ebe658-kube-api-access-48lgs\") pod \"nova-cell1-db-create-wqqtz\" (UID: \"8a86a831-6077-4935-b8ce-f48755ebe658\") " pod="openstack/nova-cell1-db-create-wqqtz" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.581294 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48lgs\" (UniqueName: \"kubernetes.io/projected/8a86a831-6077-4935-b8ce-f48755ebe658-kube-api-access-48lgs\") pod \"nova-cell1-db-create-wqqtz\" (UID: \"8a86a831-6077-4935-b8ce-f48755ebe658\") " pod="openstack/nova-cell1-db-create-wqqtz" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.673439 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wqqtz" Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.820452 4974 generic.go:334] "Generic (PLEG): container finished" podID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerID="6e413a58dfacc4cc0cb02d10a0d42b36c2027f5aa17662a6ac662655f76ddc5a" exitCode=0 Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.820698 4974 generic.go:334] "Generic (PLEG): container finished" podID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerID="b7f79f6b3db4cc132432565344c910f33e338a4e438f0afb6abf1a50a3bd3ab7" exitCode=2 Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.820708 4974 generic.go:334] "Generic (PLEG): container finished" podID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerID="0eb0eb3be527351f205c0c832510cfcd31bdedf215a4a44f7502979e84a86cbb" exitCode=0 Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.820714 4974 generic.go:334] "Generic (PLEG): container finished" podID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerID="e4e2fd48574bd761c7f36b61f4299a4b4df8e7e00c37264f660dc1584e5899c1" exitCode=0 Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.820734 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd03cb7a-09fc-43e9-8310-8a8841090c43","Type":"ContainerDied","Data":"6e413a58dfacc4cc0cb02d10a0d42b36c2027f5aa17662a6ac662655f76ddc5a"} Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.820759 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd03cb7a-09fc-43e9-8310-8a8841090c43","Type":"ContainerDied","Data":"b7f79f6b3db4cc132432565344c910f33e338a4e438f0afb6abf1a50a3bd3ab7"} Oct 13 18:32:50 crc kubenswrapper[4974]: I1013 18:32:50.820768 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd03cb7a-09fc-43e9-8310-8a8841090c43","Type":"ContainerDied","Data":"0eb0eb3be527351f205c0c832510cfcd31bdedf215a4a44f7502979e84a86cbb"} Oct 13 18:32:50 crc 
kubenswrapper[4974]: I1013 18:32:50.820775 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd03cb7a-09fc-43e9-8310-8a8841090c43","Type":"ContainerDied","Data":"e4e2fd48574bd761c7f36b61f4299a4b4df8e7e00c37264f660dc1584e5899c1"} Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.021870 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.052927 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-65hg2"] Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.069723 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-run-httpd\") pod \"dd03cb7a-09fc-43e9-8310-8a8841090c43\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.069782 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-scripts\") pod \"dd03cb7a-09fc-43e9-8310-8a8841090c43\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.069838 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-config-data\") pod \"dd03cb7a-09fc-43e9-8310-8a8841090c43\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.069874 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztwzw\" (UniqueName: \"kubernetes.io/projected/dd03cb7a-09fc-43e9-8310-8a8841090c43-kube-api-access-ztwzw\") pod \"dd03cb7a-09fc-43e9-8310-8a8841090c43\" (UID: 
\"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.070343 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dd03cb7a-09fc-43e9-8310-8a8841090c43" (UID: "dd03cb7a-09fc-43e9-8310-8a8841090c43"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.070960 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-sg-core-conf-yaml\") pod \"dd03cb7a-09fc-43e9-8310-8a8841090c43\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.071214 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-combined-ca-bundle\") pod \"dd03cb7a-09fc-43e9-8310-8a8841090c43\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.071494 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-log-httpd\") pod \"dd03cb7a-09fc-43e9-8310-8a8841090c43\" (UID: \"dd03cb7a-09fc-43e9-8310-8a8841090c43\") " Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.072562 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dd03cb7a-09fc-43e9-8310-8a8841090c43" (UID: "dd03cb7a-09fc-43e9-8310-8a8841090c43"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.073149 4974 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.073165 4974 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd03cb7a-09fc-43e9-8310-8a8841090c43-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.076306 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-scripts" (OuterVolumeSpecName: "scripts") pod "dd03cb7a-09fc-43e9-8310-8a8841090c43" (UID: "dd03cb7a-09fc-43e9-8310-8a8841090c43"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.076423 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd03cb7a-09fc-43e9-8310-8a8841090c43-kube-api-access-ztwzw" (OuterVolumeSpecName: "kube-api-access-ztwzw") pod "dd03cb7a-09fc-43e9-8310-8a8841090c43" (UID: "dd03cb7a-09fc-43e9-8310-8a8841090c43"). InnerVolumeSpecName "kube-api-access-ztwzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.105539 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dd03cb7a-09fc-43e9-8310-8a8841090c43" (UID: "dd03cb7a-09fc-43e9-8310-8a8841090c43"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.181503 4974 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.181792 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.181804 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztwzw\" (UniqueName: \"kubernetes.io/projected/dd03cb7a-09fc-43e9-8310-8a8841090c43-kube-api-access-ztwzw\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.195348 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd03cb7a-09fc-43e9-8310-8a8841090c43" (UID: "dd03cb7a-09fc-43e9-8310-8a8841090c43"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.198825 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-j6b9d"] Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.226127 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-config-data" (OuterVolumeSpecName: "config-data") pod "dd03cb7a-09fc-43e9-8310-8a8841090c43" (UID: "dd03cb7a-09fc-43e9-8310-8a8841090c43"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.283502 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.283533 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd03cb7a-09fc-43e9-8310-8a8841090c43-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.312129 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-wqqtz"] Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.811928 4974 scope.go:117] "RemoveContainer" containerID="98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d" Oct 13 18:32:51 crc kubenswrapper[4974]: E1013 18:32:51.812422 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 20s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(89790087-1d9c-4278-b62f-e18a94775048)\"" pod="openstack/watcher-decision-engine-0" podUID="89790087-1d9c-4278-b62f-e18a94775048" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.840571 4974 generic.go:334] "Generic (PLEG): container finished" podID="27f6985f-10d9-40de-8441-8289ed83515c" containerID="a19a145f311b1b683a2de53c61ce02eaf50556bf4cba22a2ba3092fd81bf65e6" exitCode=0 Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.840710 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j6b9d" event={"ID":"27f6985f-10d9-40de-8441-8289ed83515c","Type":"ContainerDied","Data":"a19a145f311b1b683a2de53c61ce02eaf50556bf4cba22a2ba3092fd81bf65e6"} Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.840754 
4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j6b9d" event={"ID":"27f6985f-10d9-40de-8441-8289ed83515c","Type":"ContainerStarted","Data":"bfcde637981d5edb1412abc749cc6cf2e7353861fa12485a98587fe3e069e218"} Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.844429 4974 generic.go:334] "Generic (PLEG): container finished" podID="8a86a831-6077-4935-b8ce-f48755ebe658" containerID="3a154b3aab623f21b5fffe856a6752def8ce1bdb6b4174d40014dc50108ef0da" exitCode=0 Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.844551 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wqqtz" event={"ID":"8a86a831-6077-4935-b8ce-f48755ebe658","Type":"ContainerDied","Data":"3a154b3aab623f21b5fffe856a6752def8ce1bdb6b4174d40014dc50108ef0da"} Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.844602 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wqqtz" event={"ID":"8a86a831-6077-4935-b8ce-f48755ebe658","Type":"ContainerStarted","Data":"e3437081f551f8ac0a30f02a7dbbebc4df9e0d1aeff1d83593d8ce9228b59774"} Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.849002 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.849008 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd03cb7a-09fc-43e9-8310-8a8841090c43","Type":"ContainerDied","Data":"367bf6e97d0da7a1d3da671beeea0229359a025de89ec42618405d9c873ee757"} Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.849104 4974 scope.go:117] "RemoveContainer" containerID="6e413a58dfacc4cc0cb02d10a0d42b36c2027f5aa17662a6ac662655f76ddc5a" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.853880 4974 generic.go:334] "Generic (PLEG): container finished" podID="2faa46ff-30c6-4bc7-9b04-69088774b56d" containerID="cb92217511c837964b294f01447b13c96c428d64da142805b454fdb92a2264f6" exitCode=0 Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.853939 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-65hg2" event={"ID":"2faa46ff-30c6-4bc7-9b04-69088774b56d","Type":"ContainerDied","Data":"cb92217511c837964b294f01447b13c96c428d64da142805b454fdb92a2264f6"} Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.853971 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-65hg2" event={"ID":"2faa46ff-30c6-4bc7-9b04-69088774b56d","Type":"ContainerStarted","Data":"ea9c8dc10b84a4152a636f866c68b8ceada8894071a63112592ecab0bcaeede0"} Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.878957 4974 scope.go:117] "RemoveContainer" containerID="b7f79f6b3db4cc132432565344c910f33e338a4e438f0afb6abf1a50a3bd3ab7" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.907202 4974 scope.go:117] "RemoveContainer" containerID="0eb0eb3be527351f205c0c832510cfcd31bdedf215a4a44f7502979e84a86cbb" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.936478 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.957487 4974 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.997966 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:51 crc kubenswrapper[4974]: E1013 18:32:51.998408 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="ceilometer-central-agent" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.998424 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="ceilometer-central-agent" Oct 13 18:32:51 crc kubenswrapper[4974]: E1013 18:32:51.998455 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="proxy-httpd" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.998464 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="proxy-httpd" Oct 13 18:32:51 crc kubenswrapper[4974]: E1013 18:32:51.998486 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="sg-core" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.998495 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="sg-core" Oct 13 18:32:51 crc kubenswrapper[4974]: E1013 18:32:51.998527 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="ceilometer-notification-agent" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.998536 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="ceilometer-notification-agent" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.998772 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" 
containerName="ceilometer-central-agent" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.998807 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="ceilometer-notification-agent" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.998826 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="sg-core" Oct 13 18:32:51 crc kubenswrapper[4974]: I1013 18:32:51.998842 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" containerName="proxy-httpd" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.002092 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.002218 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.004483 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.008491 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.008973 4974 scope.go:117] "RemoveContainer" containerID="e4e2fd48574bd761c7f36b61f4299a4b4df8e7e00c37264f660dc1584e5899c1" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.108821 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-log-httpd\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.108889 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.108936 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-scripts\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.109057 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-config-data\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.109107 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v7xv\" (UniqueName: \"kubernetes.io/projected/6f1adfb3-3344-4215-a02c-ecfc250687da-kube-api-access-9v7xv\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.109137 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.109302 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-run-httpd\") pod 
\"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.211746 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-run-httpd\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.211823 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-log-httpd\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.211855 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.211884 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-scripts\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.211967 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-config-data\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.212007 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9v7xv\" (UniqueName: \"kubernetes.io/projected/6f1adfb3-3344-4215-a02c-ecfc250687da-kube-api-access-9v7xv\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.212030 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.212715 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-log-httpd\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.213364 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-run-httpd\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.217566 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-scripts\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.218227 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 
18:32:52.226636 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.227352 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-config-data\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.241828 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v7xv\" (UniqueName: \"kubernetes.io/projected/6f1adfb3-3344-4215-a02c-ecfc250687da-kube-api-access-9v7xv\") pod \"ceilometer-0\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.322337 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:52 crc kubenswrapper[4974]: I1013 18:32:52.867993 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:52 crc kubenswrapper[4974]: W1013 18:32:52.874669 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f1adfb3_3344_4215_a02c_ecfc250687da.slice/crio-4a6a5e314ae915e4695de4408ea01d0e219c829dcd2e967d821a2994f37866a6 WatchSource:0}: Error finding container 4a6a5e314ae915e4695de4408ea01d0e219c829dcd2e967d821a2994f37866a6: Status 404 returned error can't find the container with id 4a6a5e314ae915e4695de4408ea01d0e219c829dcd2e967d821a2994f37866a6 Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.425885 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-j6b9d" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.573871 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97vqb\" (UniqueName: \"kubernetes.io/projected/27f6985f-10d9-40de-8441-8289ed83515c-kube-api-access-97vqb\") pod \"27f6985f-10d9-40de-8441-8289ed83515c\" (UID: \"27f6985f-10d9-40de-8441-8289ed83515c\") " Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.580811 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27f6985f-10d9-40de-8441-8289ed83515c-kube-api-access-97vqb" (OuterVolumeSpecName: "kube-api-access-97vqb") pod "27f6985f-10d9-40de-8441-8289ed83515c" (UID: "27f6985f-10d9-40de-8441-8289ed83515c"). InnerVolumeSpecName "kube-api-access-97vqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.581769 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-65hg2" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.646229 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wqqtz" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.676021 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x67n6\" (UniqueName: \"kubernetes.io/projected/2faa46ff-30c6-4bc7-9b04-69088774b56d-kube-api-access-x67n6\") pod \"2faa46ff-30c6-4bc7-9b04-69088774b56d\" (UID: \"2faa46ff-30c6-4bc7-9b04-69088774b56d\") " Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.676533 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97vqb\" (UniqueName: \"kubernetes.io/projected/27f6985f-10d9-40de-8441-8289ed83515c-kube-api-access-97vqb\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.679818 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2faa46ff-30c6-4bc7-9b04-69088774b56d-kube-api-access-x67n6" (OuterVolumeSpecName: "kube-api-access-x67n6") pod "2faa46ff-30c6-4bc7-9b04-69088774b56d" (UID: "2faa46ff-30c6-4bc7-9b04-69088774b56d"). InnerVolumeSpecName "kube-api-access-x67n6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.777842 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48lgs\" (UniqueName: \"kubernetes.io/projected/8a86a831-6077-4935-b8ce-f48755ebe658-kube-api-access-48lgs\") pod \"8a86a831-6077-4935-b8ce-f48755ebe658\" (UID: \"8a86a831-6077-4935-b8ce-f48755ebe658\") " Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.778273 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x67n6\" (UniqueName: \"kubernetes.io/projected/2faa46ff-30c6-4bc7-9b04-69088774b56d-kube-api-access-x67n6\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.781787 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a86a831-6077-4935-b8ce-f48755ebe658-kube-api-access-48lgs" (OuterVolumeSpecName: "kube-api-access-48lgs") pod "8a86a831-6077-4935-b8ce-f48755ebe658" (UID: "8a86a831-6077-4935-b8ce-f48755ebe658"). InnerVolumeSpecName "kube-api-access-48lgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.825139 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd03cb7a-09fc-43e9-8310-8a8841090c43" path="/var/lib/kubelet/pods/dd03cb7a-09fc-43e9-8310-8a8841090c43/volumes" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.880939 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48lgs\" (UniqueName: \"kubernetes.io/projected/8a86a831-6077-4935-b8ce-f48755ebe658-kube-api-access-48lgs\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.893952 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-wqqtz" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.894120 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-wqqtz" event={"ID":"8a86a831-6077-4935-b8ce-f48755ebe658","Type":"ContainerDied","Data":"e3437081f551f8ac0a30f02a7dbbebc4df9e0d1aeff1d83593d8ce9228b59774"} Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.894164 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3437081f551f8ac0a30f02a7dbbebc4df9e0d1aeff1d83593d8ce9228b59774" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.895703 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f1adfb3-3344-4215-a02c-ecfc250687da","Type":"ContainerStarted","Data":"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a"} Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.895735 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f1adfb3-3344-4215-a02c-ecfc250687da","Type":"ContainerStarted","Data":"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3"} Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.895749 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f1adfb3-3344-4215-a02c-ecfc250687da","Type":"ContainerStarted","Data":"4a6a5e314ae915e4695de4408ea01d0e219c829dcd2e967d821a2994f37866a6"} Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.897967 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-65hg2" event={"ID":"2faa46ff-30c6-4bc7-9b04-69088774b56d","Type":"ContainerDied","Data":"ea9c8dc10b84a4152a636f866c68b8ceada8894071a63112592ecab0bcaeede0"} Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.897992 4974 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="ea9c8dc10b84a4152a636f866c68b8ceada8894071a63112592ecab0bcaeede0" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.898077 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-65hg2" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.903717 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j6b9d" event={"ID":"27f6985f-10d9-40de-8441-8289ed83515c","Type":"ContainerDied","Data":"bfcde637981d5edb1412abc749cc6cf2e7353861fa12485a98587fe3e069e218"} Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.903751 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfcde637981d5edb1412abc749cc6cf2e7353861fa12485a98587fe3e069e218" Oct 13 18:32:53 crc kubenswrapper[4974]: I1013 18:32:53.903797 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-j6b9d" Oct 13 18:32:54 crc kubenswrapper[4974]: I1013 18:32:54.915254 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f1adfb3-3344-4215-a02c-ecfc250687da","Type":"ContainerStarted","Data":"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3"} Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.129287 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.129350 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.164998 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.194993 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.233478 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.233519 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.249783 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.277713 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.293311 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.929429 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.930263 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.930278 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:55 crc kubenswrapper[4974]: I1013 18:32:55.930289 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:56 crc kubenswrapper[4974]: I1013 18:32:56.939581 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="ceilometer-central-agent" containerID="cri-o://6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3" gracePeriod=30 
Oct 13 18:32:56 crc kubenswrapper[4974]: I1013 18:32:56.940198 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f1adfb3-3344-4215-a02c-ecfc250687da","Type":"ContainerStarted","Data":"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267"} Oct 13 18:32:56 crc kubenswrapper[4974]: I1013 18:32:56.940757 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 18:32:56 crc kubenswrapper[4974]: I1013 18:32:56.941008 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="proxy-httpd" containerID="cri-o://186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267" gracePeriod=30 Oct 13 18:32:56 crc kubenswrapper[4974]: I1013 18:32:56.941061 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="sg-core" containerID="cri-o://65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3" gracePeriod=30 Oct 13 18:32:56 crc kubenswrapper[4974]: I1013 18:32:56.941092 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="ceilometer-notification-agent" containerID="cri-o://e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a" gracePeriod=30 Oct 13 18:32:56 crc kubenswrapper[4974]: I1013 18:32:56.967973 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.94415734 podStartE2EDuration="5.967958354s" podCreationTimestamp="2025-10-13 18:32:51 +0000 UTC" firstStartedPulling="2025-10-13 18:32:52.877064965 +0000 UTC m=+1107.781431045" lastFinishedPulling="2025-10-13 18:32:55.900865979 +0000 UTC m=+1110.805232059" observedRunningTime="2025-10-13 18:32:56.96322109 
+0000 UTC m=+1111.867587180" watchObservedRunningTime="2025-10-13 18:32:56.967958354 +0000 UTC m=+1111.872324434" Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.805299 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.875145 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961302 4974 generic.go:334] "Generic (PLEG): container finished" podID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerID="186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267" exitCode=0 Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961332 4974 generic.go:334] "Generic (PLEG): container finished" podID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerID="65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3" exitCode=2 Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961340 4974 generic.go:334] "Generic (PLEG): container finished" podID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerID="e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a" exitCode=0 Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961347 4974 generic.go:334] "Generic (PLEG): container finished" podID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerID="6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3" exitCode=0 Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961414 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961483 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961530 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f1adfb3-3344-4215-a02c-ecfc250687da","Type":"ContainerDied","Data":"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267"} Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961560 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f1adfb3-3344-4215-a02c-ecfc250687da","Type":"ContainerDied","Data":"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3"} Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961572 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f1adfb3-3344-4215-a02c-ecfc250687da","Type":"ContainerDied","Data":"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a"} Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961581 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f1adfb3-3344-4215-a02c-ecfc250687da","Type":"ContainerDied","Data":"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3"} Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961593 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6f1adfb3-3344-4215-a02c-ecfc250687da","Type":"ContainerDied","Data":"4a6a5e314ae915e4695de4408ea01d0e219c829dcd2e967d821a2994f37866a6"} Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.961614 4974 scope.go:117] "RemoveContainer" containerID="186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267" Oct 13 18:32:57 crc kubenswrapper[4974]: I1013 18:32:57.993588 4974 scope.go:117] "RemoveContainer" containerID="65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.000165 4974 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-sg-core-conf-yaml\") pod \"6f1adfb3-3344-4215-a02c-ecfc250687da\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.000296 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-scripts\") pod \"6f1adfb3-3344-4215-a02c-ecfc250687da\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.000383 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-run-httpd\") pod \"6f1adfb3-3344-4215-a02c-ecfc250687da\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.000405 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-config-data\") pod \"6f1adfb3-3344-4215-a02c-ecfc250687da\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.000460 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-combined-ca-bundle\") pod \"6f1adfb3-3344-4215-a02c-ecfc250687da\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.000478 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-log-httpd\") pod \"6f1adfb3-3344-4215-a02c-ecfc250687da\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " Oct 13 
18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.000519 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v7xv\" (UniqueName: \"kubernetes.io/projected/6f1adfb3-3344-4215-a02c-ecfc250687da-kube-api-access-9v7xv\") pod \"6f1adfb3-3344-4215-a02c-ecfc250687da\" (UID: \"6f1adfb3-3344-4215-a02c-ecfc250687da\") " Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.001980 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6f1adfb3-3344-4215-a02c-ecfc250687da" (UID: "6f1adfb3-3344-4215-a02c-ecfc250687da"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.002135 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6f1adfb3-3344-4215-a02c-ecfc250687da" (UID: "6f1adfb3-3344-4215-a02c-ecfc250687da"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.005372 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-scripts" (OuterVolumeSpecName: "scripts") pod "6f1adfb3-3344-4215-a02c-ecfc250687da" (UID: "6f1adfb3-3344-4215-a02c-ecfc250687da"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.011210 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1adfb3-3344-4215-a02c-ecfc250687da-kube-api-access-9v7xv" (OuterVolumeSpecName: "kube-api-access-9v7xv") pod "6f1adfb3-3344-4215-a02c-ecfc250687da" (UID: "6f1adfb3-3344-4215-a02c-ecfc250687da"). 
InnerVolumeSpecName "kube-api-access-9v7xv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.018754 4974 scope.go:117] "RemoveContainer" containerID="e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.025340 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.025414 4974 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.027611 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.033453 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6f1adfb3-3344-4215-a02c-ecfc250687da" (UID: "6f1adfb3-3344-4215-a02c-ecfc250687da"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.056839 4974 scope.go:117] "RemoveContainer" containerID="6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.100350 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.109436 4974 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.109568 4974 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6f1adfb3-3344-4215-a02c-ecfc250687da-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.109582 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v7xv\" (UniqueName: \"kubernetes.io/projected/6f1adfb3-3344-4215-a02c-ecfc250687da-kube-api-access-9v7xv\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.109594 4974 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.109672 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.172292 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") 
pod "6f1adfb3-3344-4215-a02c-ecfc250687da" (UID: "6f1adfb3-3344-4215-a02c-ecfc250687da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.196099 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-config-data" (OuterVolumeSpecName: "config-data") pod "6f1adfb3-3344-4215-a02c-ecfc250687da" (UID: "6f1adfb3-3344-4215-a02c-ecfc250687da"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.211218 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.211253 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f1adfb3-3344-4215-a02c-ecfc250687da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.267979 4974 scope.go:117] "RemoveContainer" containerID="186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267" Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.268439 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267\": container with ID starting with 186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267 not found: ID does not exist" containerID="186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.268474 4974 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267"} err="failed to get container status \"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267\": rpc error: code = NotFound desc = could not find container \"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267\": container with ID starting with 186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.268496 4974 scope.go:117] "RemoveContainer" containerID="65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3" Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.268691 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3\": container with ID starting with 65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3 not found: ID does not exist" containerID="65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.268712 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3"} err="failed to get container status \"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3\": rpc error: code = NotFound desc = could not find container \"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3\": container with ID starting with 65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.268726 4974 scope.go:117] "RemoveContainer" containerID="e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a" Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.268915 4974 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a\": container with ID starting with e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a not found: ID does not exist" containerID="e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.268933 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a"} err="failed to get container status \"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a\": rpc error: code = NotFound desc = could not find container \"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a\": container with ID starting with e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.268944 4974 scope.go:117] "RemoveContainer" containerID="6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3" Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.269242 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3\": container with ID starting with 6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3 not found: ID does not exist" containerID="6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.269262 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3"} err="failed to get container status \"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3\": rpc error: code = NotFound desc = could not find container 
\"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3\": container with ID starting with 6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.269274 4974 scope.go:117] "RemoveContainer" containerID="186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.269472 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267"} err="failed to get container status \"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267\": rpc error: code = NotFound desc = could not find container \"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267\": container with ID starting with 186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.269490 4974 scope.go:117] "RemoveContainer" containerID="65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.269642 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3"} err="failed to get container status \"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3\": rpc error: code = NotFound desc = could not find container \"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3\": container with ID starting with 65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.269672 4974 scope.go:117] "RemoveContainer" containerID="e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.269848 4974 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a"} err="failed to get container status \"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a\": rpc error: code = NotFound desc = could not find container \"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a\": container with ID starting with e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.269864 4974 scope.go:117] "RemoveContainer" containerID="6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.270015 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3"} err="failed to get container status \"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3\": rpc error: code = NotFound desc = could not find container \"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3\": container with ID starting with 6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.270031 4974 scope.go:117] "RemoveContainer" containerID="186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.270238 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267"} err="failed to get container status \"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267\": rpc error: code = NotFound desc = could not find container \"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267\": container with ID starting with 
186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.270256 4974 scope.go:117] "RemoveContainer" containerID="65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.271565 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3"} err="failed to get container status \"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3\": rpc error: code = NotFound desc = could not find container \"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3\": container with ID starting with 65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.271584 4974 scope.go:117] "RemoveContainer" containerID="e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.271770 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a"} err="failed to get container status \"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a\": rpc error: code = NotFound desc = could not find container \"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a\": container with ID starting with e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.271788 4974 scope.go:117] "RemoveContainer" containerID="6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.272034 4974 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3"} err="failed to get container status \"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3\": rpc error: code = NotFound desc = could not find container \"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3\": container with ID starting with 6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.272091 4974 scope.go:117] "RemoveContainer" containerID="186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.272607 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267"} err="failed to get container status \"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267\": rpc error: code = NotFound desc = could not find container \"186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267\": container with ID starting with 186af2ccf5d495af8b20141f0d7dc9be38bb981ee3593a88bb550179bba04267 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.272629 4974 scope.go:117] "RemoveContainer" containerID="65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.273196 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3"} err="failed to get container status \"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3\": rpc error: code = NotFound desc = could not find container \"65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3\": container with ID starting with 65fea14ac8f02af401b2c9f9a44a697c2d3327da9bfd68e1855d5829a7d4aad3 not found: ID does not 
exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.273214 4974 scope.go:117] "RemoveContainer" containerID="e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.273596 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a"} err="failed to get container status \"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a\": rpc error: code = NotFound desc = could not find container \"e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a\": container with ID starting with e4fe5c5f665356bdcefcf7815531333fd0bb27d5d5220597d5fc6804aaccc78a not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.273612 4974 scope.go:117] "RemoveContainer" containerID="6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.273883 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3"} err="failed to get container status \"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3\": rpc error: code = NotFound desc = could not find container \"6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3\": container with ID starting with 6e4209b3e21bfd0d8574f591577e86c6b0f3353c97b57637f6342df934fdc9f3 not found: ID does not exist" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.308008 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.322753 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335217 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 
18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.335585 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27f6985f-10d9-40de-8441-8289ed83515c" containerName="mariadb-database-create" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335611 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="27f6985f-10d9-40de-8441-8289ed83515c" containerName="mariadb-database-create" Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.335624 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="ceilometer-central-agent" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335630 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="ceilometer-central-agent" Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.335643 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2faa46ff-30c6-4bc7-9b04-69088774b56d" containerName="mariadb-database-create" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335665 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="2faa46ff-30c6-4bc7-9b04-69088774b56d" containerName="mariadb-database-create" Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.335677 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="proxy-httpd" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335682 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="proxy-httpd" Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.335700 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a86a831-6077-4935-b8ce-f48755ebe658" containerName="mariadb-database-create" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335707 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a86a831-6077-4935-b8ce-f48755ebe658" 
containerName="mariadb-database-create" Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.335720 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="ceilometer-notification-agent" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335726 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="ceilometer-notification-agent" Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.335739 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="sg-core" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335747 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="sg-core" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335920 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="proxy-httpd" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335933 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="sg-core" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335950 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="ceilometer-notification-agent" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335960 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a86a831-6077-4935-b8ce-f48755ebe658" containerName="mariadb-database-create" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335973 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="27f6985f-10d9-40de-8441-8289ed83515c" containerName="mariadb-database-create" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335987 4974 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="2faa46ff-30c6-4bc7-9b04-69088774b56d" containerName="mariadb-database-create" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.335998 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" containerName="ceilometer-central-agent" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.337785 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.340704 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.340900 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.345368 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.416771 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.416830 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwmlb\" (UniqueName: \"kubernetes.io/projected/4083e410-d4be-4264-b925-f2d3d636c0c4-kube-api-access-cwmlb\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.416850 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.416889 4974 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.416856 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-log-httpd\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.417121 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-config-data\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.417401 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-run-httpd\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.417497 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-scripts\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.417544 4974 scope.go:117] "RemoveContainer" containerID="98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.417696 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.494242 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:32:58 crc kubenswrapper[4974]: E1013 18:32:58.496440 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-cwmlb log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="4083e410-d4be-4264-b925-f2d3d636c0c4" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.519326 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-run-httpd\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.519884 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-scripts\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.519996 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.520124 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.520223 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwmlb\" (UniqueName: \"kubernetes.io/projected/4083e410-d4be-4264-b925-f2d3d636c0c4-kube-api-access-cwmlb\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.520312 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-log-httpd\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.520411 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-config-data\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.519752 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-run-httpd\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.521335 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-log-httpd\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.525073 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-config-data\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.525741 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.529165 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.529883 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-scripts\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.538690 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwmlb\" (UniqueName: \"kubernetes.io/projected/4083e410-d4be-4264-b925-f2d3d636c0c4-kube-api-access-cwmlb\") pod \"ceilometer-0\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.973037 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"89790087-1d9c-4278-b62f-e18a94775048","Type":"ContainerStarted","Data":"468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65"} Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.974723 4974 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:58 crc kubenswrapper[4974]: I1013 18:32:58.985582 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.027746 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-sg-core-conf-yaml\") pod \"4083e410-d4be-4264-b925-f2d3d636c0c4\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.027823 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwmlb\" (UniqueName: \"kubernetes.io/projected/4083e410-d4be-4264-b925-f2d3d636c0c4-kube-api-access-cwmlb\") pod \"4083e410-d4be-4264-b925-f2d3d636c0c4\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.027850 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-run-httpd\") pod \"4083e410-d4be-4264-b925-f2d3d636c0c4\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.027878 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-scripts\") pod \"4083e410-d4be-4264-b925-f2d3d636c0c4\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.027917 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-combined-ca-bundle\") pod \"4083e410-d4be-4264-b925-f2d3d636c0c4\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " Oct 
13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.027944 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-config-data\") pod \"4083e410-d4be-4264-b925-f2d3d636c0c4\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.027992 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-log-httpd\") pod \"4083e410-d4be-4264-b925-f2d3d636c0c4\" (UID: \"4083e410-d4be-4264-b925-f2d3d636c0c4\") " Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.030580 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4083e410-d4be-4264-b925-f2d3d636c0c4" (UID: "4083e410-d4be-4264-b925-f2d3d636c0c4"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.030808 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4083e410-d4be-4264-b925-f2d3d636c0c4" (UID: "4083e410-d4be-4264-b925-f2d3d636c0c4"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.032926 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4083e410-d4be-4264-b925-f2d3d636c0c4" (UID: "4083e410-d4be-4264-b925-f2d3d636c0c4"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.034324 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-config-data" (OuterVolumeSpecName: "config-data") pod "4083e410-d4be-4264-b925-f2d3d636c0c4" (UID: "4083e410-d4be-4264-b925-f2d3d636c0c4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.034753 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4083e410-d4be-4264-b925-f2d3d636c0c4-kube-api-access-cwmlb" (OuterVolumeSpecName: "kube-api-access-cwmlb") pod "4083e410-d4be-4264-b925-f2d3d636c0c4" (UID: "4083e410-d4be-4264-b925-f2d3d636c0c4"). InnerVolumeSpecName "kube-api-access-cwmlb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.036725 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-scripts" (OuterVolumeSpecName: "scripts") pod "4083e410-d4be-4264-b925-f2d3d636c0c4" (UID: "4083e410-d4be-4264-b925-f2d3d636c0c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.036770 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4083e410-d4be-4264-b925-f2d3d636c0c4" (UID: "4083e410-d4be-4264-b925-f2d3d636c0c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.130531 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.130568 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.130581 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.130589 4974 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.130599 4974 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4083e410-d4be-4264-b925-f2d3d636c0c4-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.130607 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwmlb\" (UniqueName: \"kubernetes.io/projected/4083e410-d4be-4264-b925-f2d3d636c0c4-kube-api-access-cwmlb\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.130616 4974 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4083e410-d4be-4264-b925-f2d3d636c0c4-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.822340 4974 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="6f1adfb3-3344-4215-a02c-ecfc250687da" path="/var/lib/kubelet/pods/6f1adfb3-3344-4215-a02c-ecfc250687da/volumes" Oct 13 18:32:59 crc kubenswrapper[4974]: I1013 18:32:59.994234 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.051712 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.062241 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.072203 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.074496 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.076447 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.077188 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.083490 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.149436 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.149507 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-log-httpd\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.149537 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6dxw\" (UniqueName: \"kubernetes.io/projected/63f95a4e-d8a2-4655-9eac-51a72744bea2-kube-api-access-n6dxw\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.149560 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.149578 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-run-httpd\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.149607 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-config-data\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.149717 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-scripts\") pod \"ceilometer-0\" (UID: 
\"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.220085 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-30c2-account-create-grnv8"] Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.221479 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-30c2-account-create-grnv8" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.223610 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.240995 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-30c2-account-create-grnv8"] Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.251141 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdvr\" (UniqueName: \"kubernetes.io/projected/b287482e-c536-4a68-8c64-3e8fbfbc8c4c-kube-api-access-ftdvr\") pod \"nova-api-30c2-account-create-grnv8\" (UID: \"b287482e-c536-4a68-8c64-3e8fbfbc8c4c\") " pod="openstack/nova-api-30c2-account-create-grnv8" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.251201 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-log-httpd\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.251256 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6dxw\" (UniqueName: \"kubernetes.io/projected/63f95a4e-d8a2-4655-9eac-51a72744bea2-kube-api-access-n6dxw\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.251288 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.251315 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-run-httpd\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.251343 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-config-data\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.251575 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-log-httpd\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.251697 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-scripts\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.251740 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " 
pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.251744 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-run-httpd\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.257135 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-scripts\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.257200 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.258893 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-config-data\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.262244 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.268154 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6dxw\" (UniqueName: 
\"kubernetes.io/projected/63f95a4e-d8a2-4655-9eac-51a72744bea2-kube-api-access-n6dxw\") pod \"ceilometer-0\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.353407 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftdvr\" (UniqueName: \"kubernetes.io/projected/b287482e-c536-4a68-8c64-3e8fbfbc8c4c-kube-api-access-ftdvr\") pod \"nova-api-30c2-account-create-grnv8\" (UID: \"b287482e-c536-4a68-8c64-3e8fbfbc8c4c\") " pod="openstack/nova-api-30c2-account-create-grnv8" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.372846 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftdvr\" (UniqueName: \"kubernetes.io/projected/b287482e-c536-4a68-8c64-3e8fbfbc8c4c-kube-api-access-ftdvr\") pod \"nova-api-30c2-account-create-grnv8\" (UID: \"b287482e-c536-4a68-8c64-3e8fbfbc8c4c\") " pod="openstack/nova-api-30c2-account-create-grnv8" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.398739 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.466572 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1ad5-account-create-9j2hm"] Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.468214 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1ad5-account-create-9j2hm" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.470280 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.476412 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1ad5-account-create-9j2hm"] Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.550619 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-30c2-account-create-grnv8" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.557543 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x8zf\" (UniqueName: \"kubernetes.io/projected/e8a7c093-f85f-4362-a73f-fea72dd2833b-kube-api-access-4x8zf\") pod \"nova-cell0-1ad5-account-create-9j2hm\" (UID: \"e8a7c093-f85f-4362-a73f-fea72dd2833b\") " pod="openstack/nova-cell0-1ad5-account-create-9j2hm" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.659716 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x8zf\" (UniqueName: \"kubernetes.io/projected/e8a7c093-f85f-4362-a73f-fea72dd2833b-kube-api-access-4x8zf\") pod \"nova-cell0-1ad5-account-create-9j2hm\" (UID: \"e8a7c093-f85f-4362-a73f-fea72dd2833b\") " pod="openstack/nova-cell0-1ad5-account-create-9j2hm" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.669506 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e1d3-account-create-jzpp5"] Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.670914 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e1d3-account-create-jzpp5" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.677268 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.681644 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e1d3-account-create-jzpp5"] Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.684964 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x8zf\" (UniqueName: \"kubernetes.io/projected/e8a7c093-f85f-4362-a73f-fea72dd2833b-kube-api-access-4x8zf\") pod \"nova-cell0-1ad5-account-create-9j2hm\" (UID: \"e8a7c093-f85f-4362-a73f-fea72dd2833b\") " pod="openstack/nova-cell0-1ad5-account-create-9j2hm" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.762325 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcqb4\" (UniqueName: \"kubernetes.io/projected/c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7-kube-api-access-jcqb4\") pod \"nova-cell1-e1d3-account-create-jzpp5\" (UID: \"c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7\") " pod="openstack/nova-cell1-e1d3-account-create-jzpp5" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.864114 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcqb4\" (UniqueName: \"kubernetes.io/projected/c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7-kube-api-access-jcqb4\") pod \"nova-cell1-e1d3-account-create-jzpp5\" (UID: \"c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7\") " pod="openstack/nova-cell1-e1d3-account-create-jzpp5" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.866777 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1ad5-account-create-9j2hm" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.894384 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcqb4\" (UniqueName: \"kubernetes.io/projected/c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7-kube-api-access-jcqb4\") pod \"nova-cell1-e1d3-account-create-jzpp5\" (UID: \"c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7\") " pod="openstack/nova-cell1-e1d3-account-create-jzpp5" Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.933320 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:00 crc kubenswrapper[4974]: W1013 18:33:00.938192 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63f95a4e_d8a2_4655_9eac_51a72744bea2.slice/crio-db61c6bbea4c012801790bc1d72fc7a01e448f59aea43d95ea8afb6c555d2a5a WatchSource:0}: Error finding container db61c6bbea4c012801790bc1d72fc7a01e448f59aea43d95ea8afb6c555d2a5a: Status 404 returned error can't find the container with id db61c6bbea4c012801790bc1d72fc7a01e448f59aea43d95ea8afb6c555d2a5a Oct 13 18:33:00 crc kubenswrapper[4974]: I1013 18:33:00.994057 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-e1d3-account-create-jzpp5" Oct 13 18:33:01 crc kubenswrapper[4974]: I1013 18:33:01.025035 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f95a4e-d8a2-4655-9eac-51a72744bea2","Type":"ContainerStarted","Data":"db61c6bbea4c012801790bc1d72fc7a01e448f59aea43d95ea8afb6c555d2a5a"} Oct 13 18:33:01 crc kubenswrapper[4974]: I1013 18:33:01.074886 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-30c2-account-create-grnv8"] Oct 13 18:33:01 crc kubenswrapper[4974]: I1013 18:33:01.320048 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1ad5-account-create-9j2hm"] Oct 13 18:33:01 crc kubenswrapper[4974]: I1013 18:33:01.453567 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e1d3-account-create-jzpp5"] Oct 13 18:33:01 crc kubenswrapper[4974]: W1013 18:33:01.458143 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc7bc2aa8_96b5_412b_9dd8_2263e8a8e5a7.slice/crio-8c226f3cc4ea3cd30cb80f7f351f8e61663a59d0c777370db661b4ecd04095aa WatchSource:0}: Error finding container 8c226f3cc4ea3cd30cb80f7f351f8e61663a59d0c777370db661b4ecd04095aa: Status 404 returned error can't find the container with id 8c226f3cc4ea3cd30cb80f7f351f8e61663a59d0c777370db661b4ecd04095aa Oct 13 18:33:01 crc kubenswrapper[4974]: I1013 18:33:01.825373 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4083e410-d4be-4264-b925-f2d3d636c0c4" path="/var/lib/kubelet/pods/4083e410-d4be-4264-b925-f2d3d636c0c4/volumes" Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 18:33:02.034678 4974 generic.go:334] "Generic (PLEG): container finished" podID="e8a7c093-f85f-4362-a73f-fea72dd2833b" containerID="d5371d8e229af4bda4cdf80ec9b0eeea6e560bb60839e1b1a87980753f9b7616" exitCode=0 Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 
18:33:02.034729 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1ad5-account-create-9j2hm" event={"ID":"e8a7c093-f85f-4362-a73f-fea72dd2833b","Type":"ContainerDied","Data":"d5371d8e229af4bda4cdf80ec9b0eeea6e560bb60839e1b1a87980753f9b7616"} Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 18:33:02.034756 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1ad5-account-create-9j2hm" event={"ID":"e8a7c093-f85f-4362-a73f-fea72dd2833b","Type":"ContainerStarted","Data":"9de0ee2e1aa8666462546c30d9d0f6bef81dcf9693825b6c79eeac9501e3efbb"} Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 18:33:02.036188 4974 generic.go:334] "Generic (PLEG): container finished" podID="b287482e-c536-4a68-8c64-3e8fbfbc8c4c" containerID="4bcdf0aa9d90318d4969b3813dd38297746fbb096340728f8eac758dfac0041b" exitCode=0 Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 18:33:02.036222 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30c2-account-create-grnv8" event={"ID":"b287482e-c536-4a68-8c64-3e8fbfbc8c4c","Type":"ContainerDied","Data":"4bcdf0aa9d90318d4969b3813dd38297746fbb096340728f8eac758dfac0041b"} Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 18:33:02.036237 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30c2-account-create-grnv8" event={"ID":"b287482e-c536-4a68-8c64-3e8fbfbc8c4c","Type":"ContainerStarted","Data":"8e80a99ea1942cd1346a88e719cb30c338029cde85592928dd0db107d96b15c3"} Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 18:33:02.037597 4974 generic.go:334] "Generic (PLEG): container finished" podID="c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7" containerID="eebfad1b1cce0c4fbbd799ff15fad8e6d401b0a58a14d6064ffa656f94b4b983" exitCode=0 Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 18:33:02.037634 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e1d3-account-create-jzpp5" 
event={"ID":"c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7","Type":"ContainerDied","Data":"eebfad1b1cce0c4fbbd799ff15fad8e6d401b0a58a14d6064ffa656f94b4b983"} Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 18:33:02.037661 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e1d3-account-create-jzpp5" event={"ID":"c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7","Type":"ContainerStarted","Data":"8c226f3cc4ea3cd30cb80f7f351f8e61663a59d0c777370db661b4ecd04095aa"} Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 18:33:02.039600 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f95a4e-d8a2-4655-9eac-51a72744bea2","Type":"ContainerStarted","Data":"54a4221f2a11ad2afdf813bedef42b1a37f5bb09a3b14b466c424fb30125d9d9"} Oct 13 18:33:02 crc kubenswrapper[4974]: I1013 18:33:02.039626 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f95a4e-d8a2-4655-9eac-51a72744bea2","Type":"ContainerStarted","Data":"d71191c00513f70a3d222080896d9471279acce934557e02b574ba747cbe439b"} Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.057421 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f95a4e-d8a2-4655-9eac-51a72744bea2","Type":"ContainerStarted","Data":"ddd156e87b0bf8444497f094801428bc202cf3441af5fb5b22c1eed63698124f"} Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.455731 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1ad5-account-create-9j2hm" Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.511907 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x8zf\" (UniqueName: \"kubernetes.io/projected/e8a7c093-f85f-4362-a73f-fea72dd2833b-kube-api-access-4x8zf\") pod \"e8a7c093-f85f-4362-a73f-fea72dd2833b\" (UID: \"e8a7c093-f85f-4362-a73f-fea72dd2833b\") " Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.523618 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a7c093-f85f-4362-a73f-fea72dd2833b-kube-api-access-4x8zf" (OuterVolumeSpecName: "kube-api-access-4x8zf") pod "e8a7c093-f85f-4362-a73f-fea72dd2833b" (UID: "e8a7c093-f85f-4362-a73f-fea72dd2833b"). InnerVolumeSpecName "kube-api-access-4x8zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.614952 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x8zf\" (UniqueName: \"kubernetes.io/projected/e8a7c093-f85f-4362-a73f-fea72dd2833b-kube-api-access-4x8zf\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.794922 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e1d3-account-create-jzpp5" Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.801873 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-30c2-account-create-grnv8" Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.826595 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcqb4\" (UniqueName: \"kubernetes.io/projected/c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7-kube-api-access-jcqb4\") pod \"c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7\" (UID: \"c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7\") " Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.826705 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftdvr\" (UniqueName: \"kubernetes.io/projected/b287482e-c536-4a68-8c64-3e8fbfbc8c4c-kube-api-access-ftdvr\") pod \"b287482e-c536-4a68-8c64-3e8fbfbc8c4c\" (UID: \"b287482e-c536-4a68-8c64-3e8fbfbc8c4c\") " Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.855635 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b287482e-c536-4a68-8c64-3e8fbfbc8c4c-kube-api-access-ftdvr" (OuterVolumeSpecName: "kube-api-access-ftdvr") pod "b287482e-c536-4a68-8c64-3e8fbfbc8c4c" (UID: "b287482e-c536-4a68-8c64-3e8fbfbc8c4c"). InnerVolumeSpecName "kube-api-access-ftdvr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.856031 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7-kube-api-access-jcqb4" (OuterVolumeSpecName: "kube-api-access-jcqb4") pod "c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7" (UID: "c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7"). InnerVolumeSpecName "kube-api-access-jcqb4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.929301 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcqb4\" (UniqueName: \"kubernetes.io/projected/c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7-kube-api-access-jcqb4\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:03 crc kubenswrapper[4974]: I1013 18:33:03.929336 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftdvr\" (UniqueName: \"kubernetes.io/projected/b287482e-c536-4a68-8c64-3e8fbfbc8c4c-kube-api-access-ftdvr\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.065216 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-30c2-account-create-grnv8" event={"ID":"b287482e-c536-4a68-8c64-3e8fbfbc8c4c","Type":"ContainerDied","Data":"8e80a99ea1942cd1346a88e719cb30c338029cde85592928dd0db107d96b15c3"} Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.065250 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e80a99ea1942cd1346a88e719cb30c338029cde85592928dd0db107d96b15c3" Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.065258 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-30c2-account-create-grnv8" Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.066149 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e1d3-account-create-jzpp5" event={"ID":"c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7","Type":"ContainerDied","Data":"8c226f3cc4ea3cd30cb80f7f351f8e61663a59d0c777370db661b4ecd04095aa"} Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.066182 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c226f3cc4ea3cd30cb80f7f351f8e61663a59d0c777370db661b4ecd04095aa" Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.066244 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e1d3-account-create-jzpp5" Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.068184 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f95a4e-d8a2-4655-9eac-51a72744bea2","Type":"ContainerStarted","Data":"018bcb54c1ffe93283f56836c36e63a9b336751b049f0148be56fad32fea2a47"} Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.068254 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.069374 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1ad5-account-create-9j2hm" event={"ID":"e8a7c093-f85f-4362-a73f-fea72dd2833b","Type":"ContainerDied","Data":"9de0ee2e1aa8666462546c30d9d0f6bef81dcf9693825b6c79eeac9501e3efbb"} Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.069395 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1ad5-account-create-9j2hm" Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.069400 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9de0ee2e1aa8666462546c30d9d0f6bef81dcf9693825b6c79eeac9501e3efbb" Oct 13 18:33:04 crc kubenswrapper[4974]: I1013 18:33:04.086745 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.368279528 podStartE2EDuration="4.086726242s" podCreationTimestamp="2025-10-13 18:33:00 +0000 UTC" firstStartedPulling="2025-10-13 18:33:00.940101648 +0000 UTC m=+1115.844467728" lastFinishedPulling="2025-10-13 18:33:03.658548362 +0000 UTC m=+1118.562914442" observedRunningTime="2025-10-13 18:33:04.08561219 +0000 UTC m=+1118.989978290" watchObservedRunningTime="2025-10-13 18:33:04.086726242 +0000 UTC m=+1118.991092322" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.651462 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sk544"] Oct 13 18:33:05 crc kubenswrapper[4974]: E1013 18:33:05.653303 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7" containerName="mariadb-account-create" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.653374 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7" containerName="mariadb-account-create" Oct 13 18:33:05 crc kubenswrapper[4974]: E1013 18:33:05.653468 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b287482e-c536-4a68-8c64-3e8fbfbc8c4c" containerName="mariadb-account-create" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.653531 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b287482e-c536-4a68-8c64-3e8fbfbc8c4c" containerName="mariadb-account-create" Oct 13 18:33:05 crc kubenswrapper[4974]: E1013 18:33:05.653609 4974 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e8a7c093-f85f-4362-a73f-fea72dd2833b" containerName="mariadb-account-create" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.653691 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a7c093-f85f-4362-a73f-fea72dd2833b" containerName="mariadb-account-create" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.653939 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a7c093-f85f-4362-a73f-fea72dd2833b" containerName="mariadb-account-create" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.654049 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7" containerName="mariadb-account-create" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.654110 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="b287482e-c536-4a68-8c64-3e8fbfbc8c4c" containerName="mariadb-account-create" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.654785 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.656835 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.657346 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.660130 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-knrjt" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.664345 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sk544"] Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.764249 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-config-data\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.764362 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.764429 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d45cb\" (UniqueName: \"kubernetes.io/projected/b7d74baa-5dd0-454d-af5d-474c29d83d21-kube-api-access-d45cb\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " 
pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.764737 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-scripts\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.866693 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-scripts\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.867085 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-config-data\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.867220 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.867354 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d45cb\" (UniqueName: \"kubernetes.io/projected/b7d74baa-5dd0-454d-af5d-474c29d83d21-kube-api-access-d45cb\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " 
pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.877610 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.877704 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-config-data\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.881231 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-scripts\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.894178 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d45cb\" (UniqueName: \"kubernetes.io/projected/b7d74baa-5dd0-454d-af5d-474c29d83d21-kube-api-access-d45cb\") pod \"nova-cell0-conductor-db-sync-sk544\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:05 crc kubenswrapper[4974]: I1013 18:33:05.980760 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:06 crc kubenswrapper[4974]: I1013 18:33:06.581421 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sk544"] Oct 13 18:33:07 crc kubenswrapper[4974]: I1013 18:33:07.101257 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sk544" event={"ID":"b7d74baa-5dd0-454d-af5d-474c29d83d21","Type":"ContainerStarted","Data":"3368654fe20ef61bca8c77b5493289a93c10adb5f7c0c5f1443a6a41ebd67a45"} Oct 13 18:33:08 crc kubenswrapper[4974]: I1013 18:33:08.419701 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:08 crc kubenswrapper[4974]: I1013 18:33:08.460455 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:09 crc kubenswrapper[4974]: I1013 18:33:09.126107 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:09 crc kubenswrapper[4974]: I1013 18:33:09.184019 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:09 crc kubenswrapper[4974]: I1013 18:33:09.236831 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:33:11 crc kubenswrapper[4974]: I1013 18:33:11.150599 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" containerID="cri-o://468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65" gracePeriod=30 Oct 13 18:33:14 crc kubenswrapper[4974]: I1013 18:33:14.190164 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sk544" 
event={"ID":"b7d74baa-5dd0-454d-af5d-474c29d83d21","Type":"ContainerStarted","Data":"3e147a6637689712582d2edf869325765299ddbf20cbf452eb204f6b399e2503"} Oct 13 18:33:14 crc kubenswrapper[4974]: I1013 18:33:14.213767 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-sk544" podStartSLOduration=2.105163705 podStartE2EDuration="9.213749727s" podCreationTimestamp="2025-10-13 18:33:05 +0000 UTC" firstStartedPulling="2025-10-13 18:33:06.579572322 +0000 UTC m=+1121.483938402" lastFinishedPulling="2025-10-13 18:33:13.688158344 +0000 UTC m=+1128.592524424" observedRunningTime="2025-10-13 18:33:14.206809321 +0000 UTC m=+1129.111175431" watchObservedRunningTime="2025-10-13 18:33:14.213749727 +0000 UTC m=+1129.118115807" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.063287 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.182768 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89790087-1d9c-4278-b62f-e18a94775048-logs\") pod \"89790087-1d9c-4278-b62f-e18a94775048\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.182832 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-combined-ca-bundle\") pod \"89790087-1d9c-4278-b62f-e18a94775048\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.183022 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-config-data\") pod \"89790087-1d9c-4278-b62f-e18a94775048\" (UID: 
\"89790087-1d9c-4278-b62f-e18a94775048\") " Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.183054 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-custom-prometheus-ca\") pod \"89790087-1d9c-4278-b62f-e18a94775048\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.183076 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6cz9\" (UniqueName: \"kubernetes.io/projected/89790087-1d9c-4278-b62f-e18a94775048-kube-api-access-c6cz9\") pod \"89790087-1d9c-4278-b62f-e18a94775048\" (UID: \"89790087-1d9c-4278-b62f-e18a94775048\") " Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.189540 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89790087-1d9c-4278-b62f-e18a94775048-logs" (OuterVolumeSpecName: "logs") pod "89790087-1d9c-4278-b62f-e18a94775048" (UID: "89790087-1d9c-4278-b62f-e18a94775048"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.202107 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89790087-1d9c-4278-b62f-e18a94775048-kube-api-access-c6cz9" (OuterVolumeSpecName: "kube-api-access-c6cz9") pod "89790087-1d9c-4278-b62f-e18a94775048" (UID: "89790087-1d9c-4278-b62f-e18a94775048"). InnerVolumeSpecName "kube-api-access-c6cz9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.232765 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "89790087-1d9c-4278-b62f-e18a94775048" (UID: "89790087-1d9c-4278-b62f-e18a94775048"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.236309 4974 generic.go:334] "Generic (PLEG): container finished" podID="89790087-1d9c-4278-b62f-e18a94775048" containerID="468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65" exitCode=0 Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.236344 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"89790087-1d9c-4278-b62f-e18a94775048","Type":"ContainerDied","Data":"468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65"} Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.236369 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"89790087-1d9c-4278-b62f-e18a94775048","Type":"ContainerDied","Data":"3af3f161b32a6514d44b876c084d404f4da39e7ace812ecf8a18cd4aa4700289"} Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.236383 4974 scope.go:117] "RemoveContainer" containerID="468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.236493 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.276848 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89790087-1d9c-4278-b62f-e18a94775048" (UID: "89790087-1d9c-4278-b62f-e18a94775048"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.288901 4974 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.288935 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6cz9\" (UniqueName: \"kubernetes.io/projected/89790087-1d9c-4278-b62f-e18a94775048-kube-api-access-c6cz9\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.288944 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/89790087-1d9c-4278-b62f-e18a94775048-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.288954 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.309701 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-config-data" (OuterVolumeSpecName: "config-data") pod "89790087-1d9c-4278-b62f-e18a94775048" (UID: "89790087-1d9c-4278-b62f-e18a94775048"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.336254 4974 scope.go:117] "RemoveContainer" containerID="98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.379598 4974 scope.go:117] "RemoveContainer" containerID="468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65" Oct 13 18:33:16 crc kubenswrapper[4974]: E1013 18:33:16.380087 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65\": container with ID starting with 468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65 not found: ID does not exist" containerID="468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.380133 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65"} err="failed to get container status \"468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65\": rpc error: code = NotFound desc = could not find container \"468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65\": container with ID starting with 468c0d0bf68a7d6d140af7950147573fedcd4128e475a571849bed12696aeb65 not found: ID does not exist" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.380160 4974 scope.go:117] "RemoveContainer" containerID="98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d" Oct 13 18:33:16 crc kubenswrapper[4974]: E1013 18:33:16.380618 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d\": container with ID starting with 
98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d not found: ID does not exist" containerID="98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.380667 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d"} err="failed to get container status \"98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d\": rpc error: code = NotFound desc = could not find container \"98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d\": container with ID starting with 98884cc9f593ab08116e0f91bfd97636319c63b668b07954671095553215998d not found: ID does not exist" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.391114 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89790087-1d9c-4278-b62f-e18a94775048-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.565161 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.572814 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.587744 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:33:16 crc kubenswrapper[4974]: E1013 18:33:16.588092 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.588110 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:16 crc kubenswrapper[4974]: E1013 18:33:16.588125 4974 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.588132 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.588330 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.588351 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.588361 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.588960 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.591569 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.599761 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.696635 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.696775 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xg52\" (UniqueName: \"kubernetes.io/projected/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-kube-api-access-7xg52\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.696844 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.696885 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " 
pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.696920 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-logs\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.798340 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xg52\" (UniqueName: \"kubernetes.io/projected/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-kube-api-access-7xg52\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.798446 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.798484 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.798515 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-logs\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 
18:33:16.798554 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.799479 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-logs\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.803241 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.804187 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.811328 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.814718 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xg52\" 
(UniqueName: \"kubernetes.io/projected/b6ea71d0-a795-4a73-9108-dc8e4a3e4187-kube-api-access-7xg52\") pod \"watcher-decision-engine-0\" (UID: \"b6ea71d0-a795-4a73-9108-dc8e4a3e4187\") " pod="openstack/watcher-decision-engine-0" Oct 13 18:33:16 crc kubenswrapper[4974]: I1013 18:33:16.905308 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:17 crc kubenswrapper[4974]: I1013 18:33:17.413104 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Oct 13 18:33:17 crc kubenswrapper[4974]: W1013 18:33:17.415204 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6ea71d0_a795_4a73_9108_dc8e4a3e4187.slice/crio-1f6cc5db0b1900005d32be259f60a4e097f8db70d92822f065f8109240273cb2 WatchSource:0}: Error finding container 1f6cc5db0b1900005d32be259f60a4e097f8db70d92822f065f8109240273cb2: Status 404 returned error can't find the container with id 1f6cc5db0b1900005d32be259f60a4e097f8db70d92822f065f8109240273cb2 Oct 13 18:33:17 crc kubenswrapper[4974]: I1013 18:33:17.821182 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89790087-1d9c-4278-b62f-e18a94775048" path="/var/lib/kubelet/pods/89790087-1d9c-4278-b62f-e18a94775048/volumes" Oct 13 18:33:18 crc kubenswrapper[4974]: I1013 18:33:18.260195 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b6ea71d0-a795-4a73-9108-dc8e4a3e4187","Type":"ContainerStarted","Data":"859f2d543f9ac7361c42458183d3b80c499c23618c02f3c3fec9354b7cfc5872"} Oct 13 18:33:18 crc kubenswrapper[4974]: I1013 18:33:18.260621 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b6ea71d0-a795-4a73-9108-dc8e4a3e4187","Type":"ContainerStarted","Data":"1f6cc5db0b1900005d32be259f60a4e097f8db70d92822f065f8109240273cb2"} Oct 13 18:33:18 
crc kubenswrapper[4974]: I1013 18:33:18.289550 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=2.28952413 podStartE2EDuration="2.28952413s" podCreationTimestamp="2025-10-13 18:33:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:33:18.280074294 +0000 UTC m=+1133.184440404" watchObservedRunningTime="2025-10-13 18:33:18.28952413 +0000 UTC m=+1133.193890250" Oct 13 18:33:26 crc kubenswrapper[4974]: I1013 18:33:26.906187 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:26 crc kubenswrapper[4974]: I1013 18:33:26.934360 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:27 crc kubenswrapper[4974]: I1013 18:33:27.358589 4974 generic.go:334] "Generic (PLEG): container finished" podID="b7d74baa-5dd0-454d-af5d-474c29d83d21" containerID="3e147a6637689712582d2edf869325765299ddbf20cbf452eb204f6b399e2503" exitCode=0 Oct 13 18:33:27 crc kubenswrapper[4974]: I1013 18:33:27.358673 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sk544" event={"ID":"b7d74baa-5dd0-454d-af5d-474c29d83d21","Type":"ContainerDied","Data":"3e147a6637689712582d2edf869325765299ddbf20cbf452eb204f6b399e2503"} Oct 13 18:33:27 crc kubenswrapper[4974]: I1013 18:33:27.358920 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:27 crc kubenswrapper[4974]: I1013 18:33:27.400449 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.718567 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.759023 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-scripts\") pod \"b7d74baa-5dd0-454d-af5d-474c29d83d21\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.759102 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-config-data\") pod \"b7d74baa-5dd0-454d-af5d-474c29d83d21\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.759135 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-combined-ca-bundle\") pod \"b7d74baa-5dd0-454d-af5d-474c29d83d21\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.759163 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d45cb\" (UniqueName: \"kubernetes.io/projected/b7d74baa-5dd0-454d-af5d-474c29d83d21-kube-api-access-d45cb\") pod \"b7d74baa-5dd0-454d-af5d-474c29d83d21\" (UID: \"b7d74baa-5dd0-454d-af5d-474c29d83d21\") " Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.764131 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-scripts" (OuterVolumeSpecName: "scripts") pod "b7d74baa-5dd0-454d-af5d-474c29d83d21" (UID: "b7d74baa-5dd0-454d-af5d-474c29d83d21"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.764180 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7d74baa-5dd0-454d-af5d-474c29d83d21-kube-api-access-d45cb" (OuterVolumeSpecName: "kube-api-access-d45cb") pod "b7d74baa-5dd0-454d-af5d-474c29d83d21" (UID: "b7d74baa-5dd0-454d-af5d-474c29d83d21"). InnerVolumeSpecName "kube-api-access-d45cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.790245 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-config-data" (OuterVolumeSpecName: "config-data") pod "b7d74baa-5dd0-454d-af5d-474c29d83d21" (UID: "b7d74baa-5dd0-454d-af5d-474c29d83d21"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.796810 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7d74baa-5dd0-454d-af5d-474c29d83d21" (UID: "b7d74baa-5dd0-454d-af5d-474c29d83d21"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.863458 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.863494 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.863510 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7d74baa-5dd0-454d-af5d-474c29d83d21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:28 crc kubenswrapper[4974]: I1013 18:33:28.863526 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d45cb\" (UniqueName: \"kubernetes.io/projected/b7d74baa-5dd0-454d-af5d-474c29d83d21-kube-api-access-d45cb\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.376838 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-sk544" event={"ID":"b7d74baa-5dd0-454d-af5d-474c29d83d21","Type":"ContainerDied","Data":"3368654fe20ef61bca8c77b5493289a93c10adb5f7c0c5f1443a6a41ebd67a45"} Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.376862 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-sk544" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.376883 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3368654fe20ef61bca8c77b5493289a93c10adb5f7c0c5f1443a6a41ebd67a45" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.470828 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 18:33:29 crc kubenswrapper[4974]: E1013 18:33:29.471364 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.471388 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:29 crc kubenswrapper[4974]: E1013 18:33:29.471410 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7d74baa-5dd0-454d-af5d-474c29d83d21" containerName="nova-cell0-conductor-db-sync" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.471419 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7d74baa-5dd0-454d-af5d-474c29d83d21" containerName="nova-cell0-conductor-db-sync" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.471642 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7d74baa-5dd0-454d-af5d-474c29d83d21" containerName="nova-cell0-conductor-db-sync" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.471691 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.472586 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.474982 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-knrjt" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.475591 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.480401 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.575450 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjfs2\" (UniqueName: \"kubernetes.io/projected/be594e70-2775-4c06-a266-b2fcaf428134-kube-api-access-xjfs2\") pod \"nova-cell0-conductor-0\" (UID: \"be594e70-2775-4c06-a266-b2fcaf428134\") " pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.575496 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be594e70-2775-4c06-a266-b2fcaf428134-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"be594e70-2775-4c06-a266-b2fcaf428134\") " pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.575929 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be594e70-2775-4c06-a266-b2fcaf428134-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"be594e70-2775-4c06-a266-b2fcaf428134\") " pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.678148 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be594e70-2775-4c06-a266-b2fcaf428134-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"be594e70-2775-4c06-a266-b2fcaf428134\") " pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.678816 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjfs2\" (UniqueName: \"kubernetes.io/projected/be594e70-2775-4c06-a266-b2fcaf428134-kube-api-access-xjfs2\") pod \"nova-cell0-conductor-0\" (UID: \"be594e70-2775-4c06-a266-b2fcaf428134\") " pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.678889 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be594e70-2775-4c06-a266-b2fcaf428134-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"be594e70-2775-4c06-a266-b2fcaf428134\") " pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.683448 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be594e70-2775-4c06-a266-b2fcaf428134-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"be594e70-2775-4c06-a266-b2fcaf428134\") " pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.685015 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be594e70-2775-4c06-a266-b2fcaf428134-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"be594e70-2775-4c06-a266-b2fcaf428134\") " pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.704047 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjfs2\" (UniqueName: \"kubernetes.io/projected/be594e70-2775-4c06-a266-b2fcaf428134-kube-api-access-xjfs2\") pod \"nova-cell0-conductor-0\" (UID: 
\"be594e70-2775-4c06-a266-b2fcaf428134\") " pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:29 crc kubenswrapper[4974]: I1013 18:33:29.803393 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:30 crc kubenswrapper[4974]: I1013 18:33:30.271225 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Oct 13 18:33:30 crc kubenswrapper[4974]: I1013 18:33:30.391538 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"be594e70-2775-4c06-a266-b2fcaf428134","Type":"ContainerStarted","Data":"2735c401e3fb437a98688ed9c1490df4effa2686de413ba2cc438b3018ad4801"} Oct 13 18:33:30 crc kubenswrapper[4974]: I1013 18:33:30.406408 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 13 18:33:31 crc kubenswrapper[4974]: I1013 18:33:31.407551 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"be594e70-2775-4c06-a266-b2fcaf428134","Type":"ContainerStarted","Data":"af2214da856327df28c9663897ccc2e8454aab95691410c3c8805d59aefa3e2c"} Oct 13 18:33:31 crc kubenswrapper[4974]: I1013 18:33:31.459860 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.459840737 podStartE2EDuration="2.459840737s" podCreationTimestamp="2025-10-13 18:33:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:33:31.425951563 +0000 UTC m=+1146.330317673" watchObservedRunningTime="2025-10-13 18:33:31.459840737 +0000 UTC m=+1146.364206817" Oct 13 18:33:32 crc kubenswrapper[4974]: I1013 18:33:32.471239 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:34 crc kubenswrapper[4974]: I1013 
18:33:34.482507 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 18:33:34 crc kubenswrapper[4974]: I1013 18:33:34.483046 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="cfee0cca-5b93-4a97-acea-52b40d1e5a6b" containerName="kube-state-metrics" containerID="cri-o://dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267" gracePeriod=30 Oct 13 18:33:34 crc kubenswrapper[4974]: I1013 18:33:34.972288 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.112393 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkxtg\" (UniqueName: \"kubernetes.io/projected/cfee0cca-5b93-4a97-acea-52b40d1e5a6b-kube-api-access-dkxtg\") pod \"cfee0cca-5b93-4a97-acea-52b40d1e5a6b\" (UID: \"cfee0cca-5b93-4a97-acea-52b40d1e5a6b\") " Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.128531 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfee0cca-5b93-4a97-acea-52b40d1e5a6b-kube-api-access-dkxtg" (OuterVolumeSpecName: "kube-api-access-dkxtg") pod "cfee0cca-5b93-4a97-acea-52b40d1e5a6b" (UID: "cfee0cca-5b93-4a97-acea-52b40d1e5a6b"). InnerVolumeSpecName "kube-api-access-dkxtg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.214947 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkxtg\" (UniqueName: \"kubernetes.io/projected/cfee0cca-5b93-4a97-acea-52b40d1e5a6b-kube-api-access-dkxtg\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.506535 4974 generic.go:334] "Generic (PLEG): container finished" podID="cfee0cca-5b93-4a97-acea-52b40d1e5a6b" containerID="dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267" exitCode=2 Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.506594 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.506610 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cfee0cca-5b93-4a97-acea-52b40d1e5a6b","Type":"ContainerDied","Data":"dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267"} Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.506732 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"cfee0cca-5b93-4a97-acea-52b40d1e5a6b","Type":"ContainerDied","Data":"bc233a57ea6a107b65dc4ed0c35fd58ad2bb9a76284a5e0d8bebd3a4d97514a8"} Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.506776 4974 scope.go:117] "RemoveContainer" containerID="dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.553581 4974 scope.go:117] "RemoveContainer" containerID="dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267" Oct 13 18:33:35 crc kubenswrapper[4974]: E1013 18:33:35.555324 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267\": container with ID starting with dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267 not found: ID does not exist" containerID="dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.555377 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267"} err="failed to get container status \"dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267\": rpc error: code = NotFound desc = could not find container \"dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267\": container with ID starting with dcc0bc9bf5a3d632ed7234eea7179b96742ee0d881a6abbb8efdf763ecd7f267 not found: ID does not exist" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.567031 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.586022 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.608855 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 18:33:35 crc kubenswrapper[4974]: E1013 18:33:35.609334 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfee0cca-5b93-4a97-acea-52b40d1e5a6b" containerName="kube-state-metrics" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.609358 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfee0cca-5b93-4a97-acea-52b40d1e5a6b" containerName="kube-state-metrics" Oct 13 18:33:35 crc kubenswrapper[4974]: E1013 18:33:35.609382 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 
18:33:35.609391 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="89790087-1d9c-4278-b62f-e18a94775048" containerName="watcher-decision-engine" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.609612 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfee0cca-5b93-4a97-acea-52b40d1e5a6b" containerName="kube-state-metrics" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.610445 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.613055 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.613207 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.615504 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.722702 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8609b0d5-280b-498a-88da-2de3c7e27605-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.722894 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8609b0d5-280b-498a-88da-2de3c7e27605-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.723024 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dck5j\" (UniqueName: \"kubernetes.io/projected/8609b0d5-280b-498a-88da-2de3c7e27605-kube-api-access-dck5j\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.723394 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8609b0d5-280b-498a-88da-2de3c7e27605-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.824996 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8609b0d5-280b-498a-88da-2de3c7e27605-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.825059 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8609b0d5-280b-498a-88da-2de3c7e27605-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.825095 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dck5j\" (UniqueName: \"kubernetes.io/projected/8609b0d5-280b-498a-88da-2de3c7e27605-kube-api-access-dck5j\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.825163 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8609b0d5-280b-498a-88da-2de3c7e27605-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.837303 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8609b0d5-280b-498a-88da-2de3c7e27605-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.839491 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/8609b0d5-280b-498a-88da-2de3c7e27605-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.839572 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfee0cca-5b93-4a97-acea-52b40d1e5a6b" path="/var/lib/kubelet/pods/cfee0cca-5b93-4a97-acea-52b40d1e5a6b/volumes" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.846026 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dck5j\" (UniqueName: \"kubernetes.io/projected/8609b0d5-280b-498a-88da-2de3c7e27605-kube-api-access-dck5j\") pod \"kube-state-metrics-0\" (UID: \"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.856373 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/8609b0d5-280b-498a-88da-2de3c7e27605-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"8609b0d5-280b-498a-88da-2de3c7e27605\") " pod="openstack/kube-state-metrics-0" Oct 13 18:33:35 crc kubenswrapper[4974]: I1013 18:33:35.934492 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Oct 13 18:33:36 crc kubenswrapper[4974]: I1013 18:33:36.514736 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Oct 13 18:33:36 crc kubenswrapper[4974]: I1013 18:33:36.552157 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 18:33:36 crc kubenswrapper[4974]: I1013 18:33:36.792229 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:36 crc kubenswrapper[4974]: I1013 18:33:36.792817 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="ceilometer-central-agent" containerID="cri-o://d71191c00513f70a3d222080896d9471279acce934557e02b574ba747cbe439b" gracePeriod=30 Oct 13 18:33:36 crc kubenswrapper[4974]: I1013 18:33:36.793163 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="ceilometer-notification-agent" containerID="cri-o://54a4221f2a11ad2afdf813bedef42b1a37f5bb09a3b14b466c424fb30125d9d9" gracePeriod=30 Oct 13 18:33:36 crc kubenswrapper[4974]: I1013 18:33:36.793199 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="sg-core" containerID="cri-o://ddd156e87b0bf8444497f094801428bc202cf3441af5fb5b22c1eed63698124f" gracePeriod=30 Oct 13 18:33:36 crc kubenswrapper[4974]: I1013 18:33:36.793928 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" 
containerName="proxy-httpd" containerID="cri-o://018bcb54c1ffe93283f56836c36e63a9b336751b049f0148be56fad32fea2a47" gracePeriod=30 Oct 13 18:33:37 crc kubenswrapper[4974]: I1013 18:33:37.544591 4974 generic.go:334] "Generic (PLEG): container finished" podID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerID="018bcb54c1ffe93283f56836c36e63a9b336751b049f0148be56fad32fea2a47" exitCode=0 Oct 13 18:33:37 crc kubenswrapper[4974]: I1013 18:33:37.544952 4974 generic.go:334] "Generic (PLEG): container finished" podID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerID="ddd156e87b0bf8444497f094801428bc202cf3441af5fb5b22c1eed63698124f" exitCode=2 Oct 13 18:33:37 crc kubenswrapper[4974]: I1013 18:33:37.544965 4974 generic.go:334] "Generic (PLEG): container finished" podID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerID="d71191c00513f70a3d222080896d9471279acce934557e02b574ba747cbe439b" exitCode=0 Oct 13 18:33:37 crc kubenswrapper[4974]: I1013 18:33:37.544689 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f95a4e-d8a2-4655-9eac-51a72744bea2","Type":"ContainerDied","Data":"018bcb54c1ffe93283f56836c36e63a9b336751b049f0148be56fad32fea2a47"} Oct 13 18:33:37 crc kubenswrapper[4974]: I1013 18:33:37.545020 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f95a4e-d8a2-4655-9eac-51a72744bea2","Type":"ContainerDied","Data":"ddd156e87b0bf8444497f094801428bc202cf3441af5fb5b22c1eed63698124f"} Oct 13 18:33:37 crc kubenswrapper[4974]: I1013 18:33:37.545032 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f95a4e-d8a2-4655-9eac-51a72744bea2","Type":"ContainerDied","Data":"d71191c00513f70a3d222080896d9471279acce934557e02b574ba747cbe439b"} Oct 13 18:33:37 crc kubenswrapper[4974]: I1013 18:33:37.546212 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"8609b0d5-280b-498a-88da-2de3c7e27605","Type":"ContainerStarted","Data":"d69a8fb0db5e46e471cc36b68a4d8a1aa58c5a92c58466d2d1ae0cab67f6311f"} Oct 13 18:33:37 crc kubenswrapper[4974]: I1013 18:33:37.546237 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8609b0d5-280b-498a-88da-2de3c7e27605","Type":"ContainerStarted","Data":"a4230ed1ec42ea10b158ee9480f6b501312d5bd2d7ebf5af0ababea52e93cac6"} Oct 13 18:33:37 crc kubenswrapper[4974]: I1013 18:33:37.546344 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Oct 13 18:33:37 crc kubenswrapper[4974]: I1013 18:33:37.570300 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.142419704 podStartE2EDuration="2.570279255s" podCreationTimestamp="2025-10-13 18:33:35 +0000 UTC" firstStartedPulling="2025-10-13 18:33:36.54998687 +0000 UTC m=+1151.454352940" lastFinishedPulling="2025-10-13 18:33:36.977846411 +0000 UTC m=+1151.882212491" observedRunningTime="2025-10-13 18:33:37.566081587 +0000 UTC m=+1152.470447687" watchObservedRunningTime="2025-10-13 18:33:37.570279255 +0000 UTC m=+1152.474645335" Oct 13 18:33:39 crc kubenswrapper[4974]: I1013 18:33:39.865587 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.366009 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-5q4sp"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.368027 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.369466 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.370335 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.380213 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5q4sp"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.409815 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82q4w\" (UniqueName: \"kubernetes.io/projected/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-kube-api-access-82q4w\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.409877 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-config-data\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.409956 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.410029 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-scripts\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.512129 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-scripts\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.512247 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82q4w\" (UniqueName: \"kubernetes.io/projected/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-kube-api-access-82q4w\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.512272 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-config-data\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.512323 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.520230 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.532446 4974 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.538515 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-scripts\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.538926 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.541926 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.551474 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82q4w\" (UniqueName: \"kubernetes.io/projected/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-kube-api-access-82q4w\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.558552 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-config-data\") pod \"nova-cell0-cell-mapping-5q4sp\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.572370 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.626053 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndpmq\" (UniqueName: \"kubernetes.io/projected/76d80de1-0221-4438-9300-b1ac5c3b286b-kube-api-access-ndpmq\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.626092 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-config-data\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.626142 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76d80de1-0221-4438-9300-b1ac5c3b286b-logs\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.626158 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.649616 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.651252 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.655154 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.671849 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.701604 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.729739 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " pod="openstack/nova-scheduler-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.729799 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndpmq\" (UniqueName: \"kubernetes.io/projected/76d80de1-0221-4438-9300-b1ac5c3b286b-kube-api-access-ndpmq\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.729826 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-config-data\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.729899 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76d80de1-0221-4438-9300-b1ac5c3b286b-logs\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" 
Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.729913 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.729980 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqwbz\" (UniqueName: \"kubernetes.io/projected/502aab84-7155-4fcd-aecc-ada0de65f5d2-kube-api-access-bqwbz\") pod \"nova-scheduler-0\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " pod="openstack/nova-scheduler-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.730033 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-config-data\") pod \"nova-scheduler-0\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " pod="openstack/nova-scheduler-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.734970 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76d80de1-0221-4438-9300-b1ac5c3b286b-logs\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.735182 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-config-data\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.737776 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.752509 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.754261 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.757485 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndpmq\" (UniqueName: \"kubernetes.io/projected/76d80de1-0221-4438-9300-b1ac5c3b286b-kube-api-access-ndpmq\") pod \"nova-api-0\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.758024 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.778120 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.779644 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.781952 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.819702 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.831564 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-config-data\") pod \"nova-scheduler-0\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " pod="openstack/nova-scheduler-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.831623 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.831646 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.831686 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " pod="openstack/nova-scheduler-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.831700 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.831760 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rplw\" (UniqueName: \"kubernetes.io/projected/1523b55e-3b63-457e-a789-ff0254119cf7-kube-api-access-7rplw\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.831815 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqwbz\" (UniqueName: \"kubernetes.io/projected/502aab84-7155-4fcd-aecc-ada0de65f5d2-kube-api-access-bqwbz\") pod \"nova-scheduler-0\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " pod="openstack/nova-scheduler-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.831832 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-config-data\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.831850 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fr2m\" (UniqueName: \"kubernetes.io/projected/34bce935-6ffd-4772-bc4f-78eed1372e60-kube-api-access-4fr2m\") pod \"nova-cell1-novncproxy-0\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.831876 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1523b55e-3b63-457e-a789-ff0254119cf7-logs\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.844691 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.844924 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-config-data\") pod \"nova-scheduler-0\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " pod="openstack/nova-scheduler-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.851530 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " pod="openstack/nova-scheduler-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.861379 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqwbz\" (UniqueName: \"kubernetes.io/projected/502aab84-7155-4fcd-aecc-ada0de65f5d2-kube-api-access-bqwbz\") pod \"nova-scheduler-0\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " pod="openstack/nova-scheduler-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.885773 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75d8d75995-bsnd6"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.889241 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.914585 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75d8d75995-bsnd6"] Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.935801 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.935898 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-svc\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.935953 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-sb\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.936026 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rplw\" (UniqueName: \"kubernetes.io/projected/1523b55e-3b63-457e-a789-ff0254119cf7-kube-api-access-7rplw\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.936148 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtkkv\" (UniqueName: 
\"kubernetes.io/projected/6d898346-454c-4164-ba27-25363b7a75cb-kube-api-access-wtkkv\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.936220 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-config-data\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.936293 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fr2m\" (UniqueName: \"kubernetes.io/projected/34bce935-6ffd-4772-bc4f-78eed1372e60-kube-api-access-4fr2m\") pod \"nova-cell1-novncproxy-0\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.936321 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-config\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.936429 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1523b55e-3b63-457e-a789-ff0254119cf7-logs\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.936547 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-nb\") pod 
\"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.936601 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-swift-storage-0\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.936635 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.936951 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.939274 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1523b55e-3b63-457e-a789-ff0254119cf7-logs\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.940496 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc 
kubenswrapper[4974]: I1013 18:33:40.941295 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.942950 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-config-data\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.943506 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.944375 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.960544 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rplw\" (UniqueName: \"kubernetes.io/projected/1523b55e-3b63-457e-a789-ff0254119cf7-kube-api-access-7rplw\") pod \"nova-metadata-0\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:40 crc kubenswrapper[4974]: I1013 18:33:40.962265 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fr2m\" (UniqueName: \"kubernetes.io/projected/34bce935-6ffd-4772-bc4f-78eed1372e60-kube-api-access-4fr2m\") pod \"nova-cell1-novncproxy-0\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:40 
crc kubenswrapper[4974]: I1013 18:33:40.975721 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.039772 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtkkv\" (UniqueName: \"kubernetes.io/projected/6d898346-454c-4164-ba27-25363b7a75cb-kube-api-access-wtkkv\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.039931 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-config\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.040010 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-nb\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.040030 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-swift-storage-0\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.040178 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-svc\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" 
(UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.040251 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-sb\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.040856 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-config\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.042085 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-sb\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.042411 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-nb\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.042953 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-svc\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc 
kubenswrapper[4974]: I1013 18:33:41.055335 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-swift-storage-0\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.061250 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtkkv\" (UniqueName: \"kubernetes.io/projected/6d898346-454c-4164-ba27-25363b7a75cb-kube-api-access-wtkkv\") pod \"dnsmasq-dns-75d8d75995-bsnd6\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.213600 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.228598 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.237295 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.283681 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-5q4sp"] Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.472929 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.594273 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.612396 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.622854 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76d80de1-0221-4438-9300-b1ac5c3b286b","Type":"ContainerStarted","Data":"57436b4028c9ff641e196b39a19110dfd11492a207f847cb77d8c86e9469e3c2"} Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.628536 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5q4sp" event={"ID":"2f00ec4d-7131-4fbd-9077-53b53ea0abc1","Type":"ContainerStarted","Data":"cca5ce5d3932d896ce962b65b92ad60d8c7b98ae647438b2495d5c5deb0372a2"} Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.628586 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5q4sp" event={"ID":"2f00ec4d-7131-4fbd-9077-53b53ea0abc1","Type":"ContainerStarted","Data":"5db7662fb08af936c5a5bbe1f27b74fd70ee38530475f4b6139c888627331d9d"} Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.634028 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"502aab84-7155-4fcd-aecc-ada0de65f5d2","Type":"ContainerStarted","Data":"e462cea78cbf4d77dff7a66d6cf38597175583b235ec775f1f4b87a2d2eb234c"} Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.653592 4974 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-5q4sp" podStartSLOduration=1.6535778300000001 podStartE2EDuration="1.65357783s" podCreationTimestamp="2025-10-13 18:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:33:41.649033822 +0000 UTC m=+1156.553399902" watchObservedRunningTime="2025-10-13 18:33:41.65357783 +0000 UTC m=+1156.557943910" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.750753 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w4zh2"] Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.751910 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.757216 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.757767 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.778163 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w4zh2"] Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.858832 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.858944 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-config-data\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.858997 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z8nx\" (UniqueName: \"kubernetes.io/projected/e3edde18-d8b7-4d74-a226-2078fb905c13-kube-api-access-8z8nx\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.859024 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-scripts\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.878743 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75d8d75995-bsnd6"] Oct 13 18:33:41 crc kubenswrapper[4974]: W1013 18:33:41.902275 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d898346_454c_4164_ba27_25363b7a75cb.slice/crio-e1a7c1adad193fface5d48b2dc51eac0279c0f4f0186c9f02487dc8a0b419696 WatchSource:0}: Error finding container e1a7c1adad193fface5d48b2dc51eac0279c0f4f0186c9f02487dc8a0b419696: Status 404 returned error can't find the container with id e1a7c1adad193fface5d48b2dc51eac0279c0f4f0186c9f02487dc8a0b419696 Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.962433 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.962698 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-config-data\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.962897 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z8nx\" (UniqueName: \"kubernetes.io/projected/e3edde18-d8b7-4d74-a226-2078fb905c13-kube-api-access-8z8nx\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.962956 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-scripts\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.970818 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.970883 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-config-data\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.976705 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-scripts\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.983264 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z8nx\" (UniqueName: \"kubernetes.io/projected/e3edde18-d8b7-4d74-a226-2078fb905c13-kube-api-access-8z8nx\") pod \"nova-cell1-conductor-db-sync-w4zh2\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:41 crc kubenswrapper[4974]: I1013 18:33:41.983584 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:42 crc kubenswrapper[4974]: I1013 18:33:42.081141 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:42 crc kubenswrapper[4974]: I1013 18:33:42.668830 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1523b55e-3b63-457e-a789-ff0254119cf7","Type":"ContainerStarted","Data":"d5ca56444c92625f79ea0bc3dc8f3e132a18cb57e96ac2cfd175540f6cc1c8ad"} Oct 13 18:33:42 crc kubenswrapper[4974]: I1013 18:33:42.670778 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"34bce935-6ffd-4772-bc4f-78eed1372e60","Type":"ContainerStarted","Data":"009f9997928a234e5cb3f1ad2bac6ac4810a3f2ffcb29eb6ae31d19f4eea410e"} Oct 13 18:33:42 crc kubenswrapper[4974]: I1013 18:33:42.672372 4974 generic.go:334] "Generic (PLEG): container finished" podID="6d898346-454c-4164-ba27-25363b7a75cb" containerID="aafa3f24015eeebd60bed988b4136b8e4f9d40ad22d2267a398d88f7ced859ca" exitCode=0 Oct 13 18:33:42 crc kubenswrapper[4974]: I1013 18:33:42.672706 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" event={"ID":"6d898346-454c-4164-ba27-25363b7a75cb","Type":"ContainerDied","Data":"aafa3f24015eeebd60bed988b4136b8e4f9d40ad22d2267a398d88f7ced859ca"} Oct 13 18:33:42 crc kubenswrapper[4974]: I1013 18:33:42.672741 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" event={"ID":"6d898346-454c-4164-ba27-25363b7a75cb","Type":"ContainerStarted","Data":"e1a7c1adad193fface5d48b2dc51eac0279c0f4f0186c9f02487dc8a0b419696"} Oct 13 18:33:42 crc kubenswrapper[4974]: I1013 18:33:42.681392 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w4zh2"] Oct 13 18:33:43 crc kubenswrapper[4974]: I1013 18:33:43.690475 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" 
event={"ID":"6d898346-454c-4164-ba27-25363b7a75cb","Type":"ContainerStarted","Data":"18c068ad3700e763b5e36120aff96bf845ccd0dd0a63e9e1d87076ba9a268b70"} Oct 13 18:33:43 crc kubenswrapper[4974]: I1013 18:33:43.691219 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:43 crc kubenswrapper[4974]: I1013 18:33:43.696268 4974 generic.go:334] "Generic (PLEG): container finished" podID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerID="54a4221f2a11ad2afdf813bedef42b1a37f5bb09a3b14b466c424fb30125d9d9" exitCode=0 Oct 13 18:33:43 crc kubenswrapper[4974]: I1013 18:33:43.696324 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f95a4e-d8a2-4655-9eac-51a72744bea2","Type":"ContainerDied","Data":"54a4221f2a11ad2afdf813bedef42b1a37f5bb09a3b14b466c424fb30125d9d9"} Oct 13 18:33:43 crc kubenswrapper[4974]: I1013 18:33:43.697723 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w4zh2" event={"ID":"e3edde18-d8b7-4d74-a226-2078fb905c13","Type":"ContainerStarted","Data":"c54198f199657ad749421ec9dd41c988fc735d4998cf1446050d644c4397ef0e"} Oct 13 18:33:43 crc kubenswrapper[4974]: I1013 18:33:43.697759 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w4zh2" event={"ID":"e3edde18-d8b7-4d74-a226-2078fb905c13","Type":"ContainerStarted","Data":"ab7aa7c3cd510cda40a893bee144075f1288af0694b1399cf09bd7380565795b"} Oct 13 18:33:43 crc kubenswrapper[4974]: I1013 18:33:43.716097 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" podStartSLOduration=3.716076351 podStartE2EDuration="3.716076351s" podCreationTimestamp="2025-10-13 18:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:33:43.708013684 +0000 UTC 
m=+1158.612379764" watchObservedRunningTime="2025-10-13 18:33:43.716076351 +0000 UTC m=+1158.620442431" Oct 13 18:33:43 crc kubenswrapper[4974]: I1013 18:33:43.726042 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-w4zh2" podStartSLOduration=2.726027521 podStartE2EDuration="2.726027521s" podCreationTimestamp="2025-10-13 18:33:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:33:43.720608348 +0000 UTC m=+1158.624974428" watchObservedRunningTime="2025-10-13 18:33:43.726027521 +0000 UTC m=+1158.630393601" Oct 13 18:33:44 crc kubenswrapper[4974]: I1013 18:33:44.786755 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 18:33:44 crc kubenswrapper[4974]: I1013 18:33:44.806057 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.646756 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.763837 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"63f95a4e-d8a2-4655-9eac-51a72744bea2","Type":"ContainerDied","Data":"db61c6bbea4c012801790bc1d72fc7a01e448f59aea43d95ea8afb6c555d2a5a"} Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.763887 4974 scope.go:117] "RemoveContainer" containerID="018bcb54c1ffe93283f56836c36e63a9b336751b049f0148be56fad32fea2a47" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.763977 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.772592 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-scripts\") pod \"63f95a4e-d8a2-4655-9eac-51a72744bea2\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.772759 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-run-httpd\") pod \"63f95a4e-d8a2-4655-9eac-51a72744bea2\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.772781 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-log-httpd\") pod \"63f95a4e-d8a2-4655-9eac-51a72744bea2\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.772823 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6dxw\" (UniqueName: \"kubernetes.io/projected/63f95a4e-d8a2-4655-9eac-51a72744bea2-kube-api-access-n6dxw\") pod \"63f95a4e-d8a2-4655-9eac-51a72744bea2\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.772855 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-sg-core-conf-yaml\") pod \"63f95a4e-d8a2-4655-9eac-51a72744bea2\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.772888 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-combined-ca-bundle\") pod \"63f95a4e-d8a2-4655-9eac-51a72744bea2\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.772995 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-config-data\") pod \"63f95a4e-d8a2-4655-9eac-51a72744bea2\" (UID: \"63f95a4e-d8a2-4655-9eac-51a72744bea2\") " Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.774100 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "63f95a4e-d8a2-4655-9eac-51a72744bea2" (UID: "63f95a4e-d8a2-4655-9eac-51a72744bea2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.774329 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "63f95a4e-d8a2-4655-9eac-51a72744bea2" (UID: "63f95a4e-d8a2-4655-9eac-51a72744bea2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.779885 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63f95a4e-d8a2-4655-9eac-51a72744bea2-kube-api-access-n6dxw" (OuterVolumeSpecName: "kube-api-access-n6dxw") pod "63f95a4e-d8a2-4655-9eac-51a72744bea2" (UID: "63f95a4e-d8a2-4655-9eac-51a72744bea2"). InnerVolumeSpecName "kube-api-access-n6dxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.782387 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-scripts" (OuterVolumeSpecName: "scripts") pod "63f95a4e-d8a2-4655-9eac-51a72744bea2" (UID: "63f95a4e-d8a2-4655-9eac-51a72744bea2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.831891 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "63f95a4e-d8a2-4655-9eac-51a72744bea2" (UID: "63f95a4e-d8a2-4655-9eac-51a72744bea2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.856140 4974 scope.go:117] "RemoveContainer" containerID="ddd156e87b0bf8444497f094801428bc202cf3441af5fb5b22c1eed63698124f" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.876005 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.876029 4974 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.876038 4974 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/63f95a4e-d8a2-4655-9eac-51a72744bea2-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.876049 4974 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-n6dxw\" (UniqueName: \"kubernetes.io/projected/63f95a4e-d8a2-4655-9eac-51a72744bea2-kube-api-access-n6dxw\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.876058 4974 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.901347 4974 scope.go:117] "RemoveContainer" containerID="54a4221f2a11ad2afdf813bedef42b1a37f5bb09a3b14b466c424fb30125d9d9" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.932547 4974 scope.go:117] "RemoveContainer" containerID="d71191c00513f70a3d222080896d9471279acce934557e02b574ba747cbe439b" Oct 13 18:33:45 crc kubenswrapper[4974]: I1013 18:33:45.947907 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.038229 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63f95a4e-d8a2-4655-9eac-51a72744bea2" (UID: "63f95a4e-d8a2-4655-9eac-51a72744bea2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.047780 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-config-data" (OuterVolumeSpecName: "config-data") pod "63f95a4e-d8a2-4655-9eac-51a72744bea2" (UID: "63f95a4e-d8a2-4655-9eac-51a72744bea2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.080789 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.080822 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63f95a4e-d8a2-4655-9eac-51a72744bea2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.111627 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.126160 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.138670 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:46 crc kubenswrapper[4974]: E1013 18:33:46.139047 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="proxy-httpd" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.139062 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="proxy-httpd" Oct 13 18:33:46 crc kubenswrapper[4974]: E1013 18:33:46.139073 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="sg-core" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.139079 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="sg-core" Oct 13 18:33:46 crc kubenswrapper[4974]: E1013 18:33:46.139103 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" 
containerName="ceilometer-notification-agent" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.139109 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="ceilometer-notification-agent" Oct 13 18:33:46 crc kubenswrapper[4974]: E1013 18:33:46.139133 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="ceilometer-central-agent" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.139138 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="ceilometer-central-agent" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.139754 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="ceilometer-notification-agent" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.140088 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="sg-core" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.140111 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="ceilometer-central-agent" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.140166 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" containerName="proxy-httpd" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.144155 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.178547 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.178830 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.178695 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.182059 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-scripts\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.182122 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-log-httpd\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.182157 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.182185 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-config-data\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " 
pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.182210 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-run-httpd\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.182237 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.182256 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2sqv\" (UniqueName: \"kubernetes.io/projected/616aad84-5d9d-41d6-9342-67199ffade4a-kube-api-access-f2sqv\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.182276 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.191607 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.285777 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-scripts\") pod \"ceilometer-0\" (UID: 
\"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.285840 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-log-httpd\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.285875 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.285904 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-config-data\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.285930 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-run-httpd\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.285956 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.285976 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-f2sqv\" (UniqueName: \"kubernetes.io/projected/616aad84-5d9d-41d6-9342-67199ffade4a-kube-api-access-f2sqv\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.285998 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.286584 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-log-httpd\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.286749 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-run-httpd\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.289292 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.289942 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-scripts\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 
18:33:46.291411 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-config-data\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.294945 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.295246 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.303261 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2sqv\" (UniqueName: \"kubernetes.io/projected/616aad84-5d9d-41d6-9342-67199ffade4a-kube-api-access-f2sqv\") pod \"ceilometer-0\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.521485 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.774522 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76d80de1-0221-4438-9300-b1ac5c3b286b","Type":"ContainerStarted","Data":"9f96a353f10adc6fd44eeb6c50ea1200eeba6435a1a14877d430fa7b5a95da06"} Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.774565 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76d80de1-0221-4438-9300-b1ac5c3b286b","Type":"ContainerStarted","Data":"97e1cdd995828883ad1d9be74d671d7f5f85484183dbe27b3b62eeff9c21ada4"} Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.792457 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1523b55e-3b63-457e-a789-ff0254119cf7","Type":"ContainerStarted","Data":"4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6"} Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.792501 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1523b55e-3b63-457e-a789-ff0254119cf7","Type":"ContainerStarted","Data":"df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d"} Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.792613 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1523b55e-3b63-457e-a789-ff0254119cf7" containerName="nova-metadata-log" containerID="cri-o://df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d" gracePeriod=30 Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.792894 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="1523b55e-3b63-457e-a789-ff0254119cf7" containerName="nova-metadata-metadata" containerID="cri-o://4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6" gracePeriod=30 Oct 13 18:33:46 crc 
kubenswrapper[4974]: I1013 18:33:46.804282 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"502aab84-7155-4fcd-aecc-ada0de65f5d2","Type":"ContainerStarted","Data":"96cd22203fba9524159bc61ad032f605190b0b5f60e6571e95d8d40617d712f5"} Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.806340 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.753204482 podStartE2EDuration="6.806321097s" podCreationTimestamp="2025-10-13 18:33:40 +0000 UTC" firstStartedPulling="2025-10-13 18:33:41.484158309 +0000 UTC m=+1156.388524389" lastFinishedPulling="2025-10-13 18:33:45.537274904 +0000 UTC m=+1160.441641004" observedRunningTime="2025-10-13 18:33:46.797396966 +0000 UTC m=+1161.701763046" watchObservedRunningTime="2025-10-13 18:33:46.806321097 +0000 UTC m=+1161.710687177" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.815848 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"34bce935-6ffd-4772-bc4f-78eed1372e60","Type":"ContainerStarted","Data":"1241cb87cad276d93066d5ab9659eb3ac6b16d2c9d8fc4680822ca212cacbc47"} Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.815957 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="34bce935-6ffd-4772-bc4f-78eed1372e60" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://1241cb87cad276d93066d5ab9659eb3ac6b16d2c9d8fc4680822ca212cacbc47" gracePeriod=30 Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.854223 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.275339527 podStartE2EDuration="6.854202305s" podCreationTimestamp="2025-10-13 18:33:40 +0000 UTC" firstStartedPulling="2025-10-13 18:33:41.985315114 +0000 UTC m=+1156.889681194" lastFinishedPulling="2025-10-13 18:33:45.564177872 
+0000 UTC m=+1160.468543972" observedRunningTime="2025-10-13 18:33:46.832763732 +0000 UTC m=+1161.737129812" watchObservedRunningTime="2025-10-13 18:33:46.854202305 +0000 UTC m=+1161.758568385" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.859423 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.042168591 podStartE2EDuration="6.859407972s" podCreationTimestamp="2025-10-13 18:33:40 +0000 UTC" firstStartedPulling="2025-10-13 18:33:41.683912175 +0000 UTC m=+1156.588278255" lastFinishedPulling="2025-10-13 18:33:45.501151556 +0000 UTC m=+1160.405517636" observedRunningTime="2025-10-13 18:33:46.85825889 +0000 UTC m=+1161.762624970" watchObservedRunningTime="2025-10-13 18:33:46.859407972 +0000 UTC m=+1161.763774052" Oct 13 18:33:46 crc kubenswrapper[4974]: I1013 18:33:46.889400 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.957512456 podStartE2EDuration="6.889384226s" podCreationTimestamp="2025-10-13 18:33:40 +0000 UTC" firstStartedPulling="2025-10-13 18:33:41.606507985 +0000 UTC m=+1156.510874065" lastFinishedPulling="2025-10-13 18:33:45.538379735 +0000 UTC m=+1160.442745835" observedRunningTime="2025-10-13 18:33:46.88169503 +0000 UTC m=+1161.786061110" watchObservedRunningTime="2025-10-13 18:33:46.889384226 +0000 UTC m=+1161.793750306" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.072754 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.318354 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.434273 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-combined-ca-bundle\") pod \"1523b55e-3b63-457e-a789-ff0254119cf7\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.434330 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1523b55e-3b63-457e-a789-ff0254119cf7-logs\") pod \"1523b55e-3b63-457e-a789-ff0254119cf7\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.434364 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-config-data\") pod \"1523b55e-3b63-457e-a789-ff0254119cf7\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.434425 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rplw\" (UniqueName: \"kubernetes.io/projected/1523b55e-3b63-457e-a789-ff0254119cf7-kube-api-access-7rplw\") pod \"1523b55e-3b63-457e-a789-ff0254119cf7\" (UID: \"1523b55e-3b63-457e-a789-ff0254119cf7\") " Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.435254 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1523b55e-3b63-457e-a789-ff0254119cf7-logs" (OuterVolumeSpecName: "logs") pod "1523b55e-3b63-457e-a789-ff0254119cf7" (UID: "1523b55e-3b63-457e-a789-ff0254119cf7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.438725 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1523b55e-3b63-457e-a789-ff0254119cf7-kube-api-access-7rplw" (OuterVolumeSpecName: "kube-api-access-7rplw") pod "1523b55e-3b63-457e-a789-ff0254119cf7" (UID: "1523b55e-3b63-457e-a789-ff0254119cf7"). InnerVolumeSpecName "kube-api-access-7rplw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.468112 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1523b55e-3b63-457e-a789-ff0254119cf7" (UID: "1523b55e-3b63-457e-a789-ff0254119cf7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.478137 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-config-data" (OuterVolumeSpecName: "config-data") pod "1523b55e-3b63-457e-a789-ff0254119cf7" (UID: "1523b55e-3b63-457e-a789-ff0254119cf7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.539018 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.539281 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rplw\" (UniqueName: \"kubernetes.io/projected/1523b55e-3b63-457e-a789-ff0254119cf7-kube-api-access-7rplw\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.539571 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1523b55e-3b63-457e-a789-ff0254119cf7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.539663 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1523b55e-3b63-457e-a789-ff0254119cf7-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.831631 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63f95a4e-d8a2-4655-9eac-51a72744bea2" path="/var/lib/kubelet/pods/63f95a4e-d8a2-4655-9eac-51a72744bea2/volumes" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.838132 4974 generic.go:334] "Generic (PLEG): container finished" podID="1523b55e-3b63-457e-a789-ff0254119cf7" containerID="4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6" exitCode=0 Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.838376 4974 generic.go:334] "Generic (PLEG): container finished" podID="1523b55e-3b63-457e-a789-ff0254119cf7" containerID="df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d" exitCode=143 Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.838586 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"1523b55e-3b63-457e-a789-ff0254119cf7","Type":"ContainerDied","Data":"4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6"} Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.838765 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1523b55e-3b63-457e-a789-ff0254119cf7","Type":"ContainerDied","Data":"df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d"} Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.838912 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"1523b55e-3b63-457e-a789-ff0254119cf7","Type":"ContainerDied","Data":"d5ca56444c92625f79ea0bc3dc8f3e132a18cb57e96ac2cfd175540f6cc1c8ad"} Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.839052 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.839058 4974 scope.go:117] "RemoveContainer" containerID="4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.849072 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"616aad84-5d9d-41d6-9342-67199ffade4a","Type":"ContainerStarted","Data":"aed60a5c13f637f89fa1e08e9857003f10646a88f4d3cd9ff45783c355390c62"} Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.849232 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"616aad84-5d9d-41d6-9342-67199ffade4a","Type":"ContainerStarted","Data":"c52aabfef332cc4dda8b2cfdd1c2e5a3f1f3030ff41a4fe490ffb8cd9fdc98f4"} Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.849323 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"616aad84-5d9d-41d6-9342-67199ffade4a","Type":"ContainerStarted","Data":"2a5addbc223f2ddebd5eb30034700bf1f979f3cc54f7d1e1c9921fa42dd043ba"} Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.864490 4974 scope.go:117] "RemoveContainer" containerID="df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.875201 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.886643 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.893051 4974 scope.go:117] "RemoveContainer" containerID="4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6" Oct 13 18:33:47 crc kubenswrapper[4974]: E1013 18:33:47.893627 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6\": container with ID starting with 4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6 not found: ID does not exist" containerID="4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.893680 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6"} err="failed to get container status \"4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6\": rpc error: code = NotFound desc = could not find container \"4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6\": container with ID starting with 4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6 not found: ID does not exist" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.893706 4974 scope.go:117] "RemoveContainer" 
containerID="df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d" Oct 13 18:33:47 crc kubenswrapper[4974]: E1013 18:33:47.893933 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d\": container with ID starting with df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d not found: ID does not exist" containerID="df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.893961 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d"} err="failed to get container status \"df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d\": rpc error: code = NotFound desc = could not find container \"df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d\": container with ID starting with df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d not found: ID does not exist" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.893979 4974 scope.go:117] "RemoveContainer" containerID="4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.894200 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6"} err="failed to get container status \"4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6\": rpc error: code = NotFound desc = could not find container \"4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6\": container with ID starting with 4e346cd2d55690479a959604ebe0e410264c377fa4287e5a6af4c782157aa6b6 not found: ID does not exist" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.894222 4974 scope.go:117] 
"RemoveContainer" containerID="df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.895018 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d"} err="failed to get container status \"df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d\": rpc error: code = NotFound desc = could not find container \"df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d\": container with ID starting with df2e5aab9c45f463b08d0bd03f2d5ef363b9a6fec725b37bb8583f15c287be4d not found: ID does not exist" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.912229 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:47 crc kubenswrapper[4974]: E1013 18:33:47.912850 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1523b55e-3b63-457e-a789-ff0254119cf7" containerName="nova-metadata-metadata" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.912870 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1523b55e-3b63-457e-a789-ff0254119cf7" containerName="nova-metadata-metadata" Oct 13 18:33:47 crc kubenswrapper[4974]: E1013 18:33:47.912900 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1523b55e-3b63-457e-a789-ff0254119cf7" containerName="nova-metadata-log" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.912908 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1523b55e-3b63-457e-a789-ff0254119cf7" containerName="nova-metadata-log" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.913158 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="1523b55e-3b63-457e-a789-ff0254119cf7" containerName="nova-metadata-log" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.913191 4974 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="1523b55e-3b63-457e-a789-ff0254119cf7" containerName="nova-metadata-metadata" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.914318 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.917763 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.918097 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 18:33:47 crc kubenswrapper[4974]: I1013 18:33:47.954424 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.063209 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-config-data\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.063530 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.063641 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.063782 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53157d8c-7fed-48de-9714-cab91461a8f7-logs\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.063940 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8gh\" (UniqueName: \"kubernetes.io/projected/53157d8c-7fed-48de-9714-cab91461a8f7-kube-api-access-gs8gh\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.176786 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.176844 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.176880 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53157d8c-7fed-48de-9714-cab91461a8f7-logs\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.176952 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8gh\" (UniqueName: 
\"kubernetes.io/projected/53157d8c-7fed-48de-9714-cab91461a8f7-kube-api-access-gs8gh\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.176986 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-config-data\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.180040 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53157d8c-7fed-48de-9714-cab91461a8f7-logs\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.185590 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-config-data\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.209299 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.211618 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8gh\" (UniqueName: \"kubernetes.io/projected/53157d8c-7fed-48de-9714-cab91461a8f7-kube-api-access-gs8gh\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: 
I1013 18:33:48.213030 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.258208 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.668447 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.872734 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53157d8c-7fed-48de-9714-cab91461a8f7","Type":"ContainerStarted","Data":"451f44470147663ddf6904202559d513a74532b4036e575ad100247046cfeec4"} Oct 13 18:33:48 crc kubenswrapper[4974]: I1013 18:33:48.879879 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"616aad84-5d9d-41d6-9342-67199ffade4a","Type":"ContainerStarted","Data":"30c4533ea9b2d1d827cad3870c8a2364fcba5120f0d88a2158276d93e8e5e5d4"} Oct 13 18:33:49 crc kubenswrapper[4974]: I1013 18:33:49.827073 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1523b55e-3b63-457e-a789-ff0254119cf7" path="/var/lib/kubelet/pods/1523b55e-3b63-457e-a789-ff0254119cf7/volumes" Oct 13 18:33:49 crc kubenswrapper[4974]: I1013 18:33:49.892995 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53157d8c-7fed-48de-9714-cab91461a8f7","Type":"ContainerStarted","Data":"37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f"} Oct 13 18:33:49 crc kubenswrapper[4974]: I1013 18:33:49.893038 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"53157d8c-7fed-48de-9714-cab91461a8f7","Type":"ContainerStarted","Data":"bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093"} Oct 13 18:33:49 crc kubenswrapper[4974]: I1013 18:33:49.912833 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.91281956 podStartE2EDuration="2.91281956s" podCreationTimestamp="2025-10-13 18:33:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:33:49.910415653 +0000 UTC m=+1164.814781733" watchObservedRunningTime="2025-10-13 18:33:49.91281956 +0000 UTC m=+1164.817185640" Oct 13 18:33:50 crc kubenswrapper[4974]: I1013 18:33:50.908692 4974 generic.go:334] "Generic (PLEG): container finished" podID="2f00ec4d-7131-4fbd-9077-53b53ea0abc1" containerID="cca5ce5d3932d896ce962b65b92ad60d8c7b98ae647438b2495d5c5deb0372a2" exitCode=0 Oct 13 18:33:50 crc kubenswrapper[4974]: I1013 18:33:50.909028 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5q4sp" event={"ID":"2f00ec4d-7131-4fbd-9077-53b53ea0abc1","Type":"ContainerDied","Data":"cca5ce5d3932d896ce962b65b92ad60d8c7b98ae647438b2495d5c5deb0372a2"} Oct 13 18:33:50 crc kubenswrapper[4974]: I1013 18:33:50.912947 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"616aad84-5d9d-41d6-9342-67199ffade4a","Type":"ContainerStarted","Data":"29f125fc4dce659e6f293987d08ade7a2608d844f008f3a491681cb5103d0077"} Oct 13 18:33:50 crc kubenswrapper[4974]: I1013 18:33:50.913167 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 18:33:50 crc kubenswrapper[4974]: I1013 18:33:50.943154 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 18:33:50 crc kubenswrapper[4974]: I1013 18:33:50.943223 4974 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 18:33:50 crc kubenswrapper[4974]: I1013 18:33:50.963738 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.037361523 podStartE2EDuration="4.963713542s" podCreationTimestamp="2025-10-13 18:33:46 +0000 UTC" firstStartedPulling="2025-10-13 18:33:47.085388377 +0000 UTC m=+1161.989754457" lastFinishedPulling="2025-10-13 18:33:50.011740386 +0000 UTC m=+1164.916106476" observedRunningTime="2025-10-13 18:33:50.959063903 +0000 UTC m=+1165.863429983" watchObservedRunningTime="2025-10-13 18:33:50.963713542 +0000 UTC m=+1165.868079622" Oct 13 18:33:50 crc kubenswrapper[4974]: I1013 18:33:50.976299 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 13 18:33:50 crc kubenswrapper[4974]: I1013 18:33:50.976347 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.008627 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.213954 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.238837 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.308240 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dd5946f99-q7kth"] Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.308519 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" podUID="33807b55-8fc9-44dc-9639-b633bd748101" containerName="dnsmasq-dns" 
containerID="cri-o://36fe83fb2a3be1c32d1dbc57d613e58f31a369a15a541a17891b27354a1de5e1" gracePeriod=10 Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.923719 4974 generic.go:334] "Generic (PLEG): container finished" podID="33807b55-8fc9-44dc-9639-b633bd748101" containerID="36fe83fb2a3be1c32d1dbc57d613e58f31a369a15a541a17891b27354a1de5e1" exitCode=0 Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.924199 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" event={"ID":"33807b55-8fc9-44dc-9639-b633bd748101","Type":"ContainerDied","Data":"36fe83fb2a3be1c32d1dbc57d613e58f31a369a15a541a17891b27354a1de5e1"} Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.924226 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" event={"ID":"33807b55-8fc9-44dc-9639-b633bd748101","Type":"ContainerDied","Data":"47e5f8cb902205fb77a7f4c4c551bb7a6df12a6648997cd78370f29aaa374cbd"} Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.924236 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="47e5f8cb902205fb77a7f4c4c551bb7a6df12a6648997cd78370f29aaa374cbd" Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.932717 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.965354 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md9dr\" (UniqueName: \"kubernetes.io/projected/33807b55-8fc9-44dc-9639-b633bd748101-kube-api-access-md9dr\") pod \"33807b55-8fc9-44dc-9639-b633bd748101\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.965459 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-svc\") pod \"33807b55-8fc9-44dc-9639-b633bd748101\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.965532 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-nb\") pod \"33807b55-8fc9-44dc-9639-b633bd748101\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.965595 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-config\") pod \"33807b55-8fc9-44dc-9639-b633bd748101\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.965696 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-swift-storage-0\") pod \"33807b55-8fc9-44dc-9639-b633bd748101\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.965714 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-sb\") pod \"33807b55-8fc9-44dc-9639-b633bd748101\" (UID: \"33807b55-8fc9-44dc-9639-b633bd748101\") " Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.981227 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.993819 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 18:33:51 crc kubenswrapper[4974]: I1013 18:33:51.994132 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.017929 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33807b55-8fc9-44dc-9639-b633bd748101-kube-api-access-md9dr" (OuterVolumeSpecName: "kube-api-access-md9dr") pod "33807b55-8fc9-44dc-9639-b633bd748101" (UID: "33807b55-8fc9-44dc-9639-b633bd748101"). InnerVolumeSpecName "kube-api-access-md9dr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.063817 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "33807b55-8fc9-44dc-9639-b633bd748101" (UID: "33807b55-8fc9-44dc-9639-b633bd748101"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.083814 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.083869 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md9dr\" (UniqueName: \"kubernetes.io/projected/33807b55-8fc9-44dc-9639-b633bd748101-kube-api-access-md9dr\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.084422 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "33807b55-8fc9-44dc-9639-b633bd748101" (UID: "33807b55-8fc9-44dc-9639-b633bd748101"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.100131 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "33807b55-8fc9-44dc-9639-b633bd748101" (UID: "33807b55-8fc9-44dc-9639-b633bd748101"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.122397 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "33807b55-8fc9-44dc-9639-b633bd748101" (UID: "33807b55-8fc9-44dc-9639-b633bd748101"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.129538 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-config" (OuterVolumeSpecName: "config") pod "33807b55-8fc9-44dc-9639-b633bd748101" (UID: "33807b55-8fc9-44dc-9639-b633bd748101"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.185576 4974 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.185619 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.185632 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.185643 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33807b55-8fc9-44dc-9639-b633bd748101-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.316019 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.389258 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-combined-ca-bundle\") pod \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.389300 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82q4w\" (UniqueName: \"kubernetes.io/projected/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-kube-api-access-82q4w\") pod \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.389363 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-scripts\") pod \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.389486 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-config-data\") pod \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\" (UID: \"2f00ec4d-7131-4fbd-9077-53b53ea0abc1\") " Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.392987 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-kube-api-access-82q4w" (OuterVolumeSpecName: "kube-api-access-82q4w") pod "2f00ec4d-7131-4fbd-9077-53b53ea0abc1" (UID: "2f00ec4d-7131-4fbd-9077-53b53ea0abc1"). InnerVolumeSpecName "kube-api-access-82q4w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.420845 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-scripts" (OuterVolumeSpecName: "scripts") pod "2f00ec4d-7131-4fbd-9077-53b53ea0abc1" (UID: "2f00ec4d-7131-4fbd-9077-53b53ea0abc1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.428822 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-config-data" (OuterVolumeSpecName: "config-data") pod "2f00ec4d-7131-4fbd-9077-53b53ea0abc1" (UID: "2f00ec4d-7131-4fbd-9077-53b53ea0abc1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.460737 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f00ec4d-7131-4fbd-9077-53b53ea0abc1" (UID: "2f00ec4d-7131-4fbd-9077-53b53ea0abc1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.491666 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.491707 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.491722 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82q4w\" (UniqueName: \"kubernetes.io/projected/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-kube-api-access-82q4w\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.491733 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2f00ec4d-7131-4fbd-9077-53b53ea0abc1-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.933165 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-5q4sp" event={"ID":"2f00ec4d-7131-4fbd-9077-53b53ea0abc1","Type":"ContainerDied","Data":"5db7662fb08af936c5a5bbe1f27b74fd70ee38530475f4b6139c888627331d9d"} Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.933449 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db7662fb08af936c5a5bbe1f27b74fd70ee38530475f4b6139c888627331d9d" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.933206 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-5q4sp" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.933197 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dd5946f99-q7kth" Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.978991 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dd5946f99-q7kth"] Oct 13 18:33:52 crc kubenswrapper[4974]: I1013 18:33:52.987436 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dd5946f99-q7kth"] Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.052306 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.052531 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerName="nova-api-log" containerID="cri-o://97e1cdd995828883ad1d9be74d671d7f5f85484183dbe27b3b62eeff9c21ada4" gracePeriod=30 Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.052621 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerName="nova-api-api" containerID="cri-o://9f96a353f10adc6fd44eeb6c50ea1200eeba6435a1a14877d430fa7b5a95da06" gracePeriod=30 Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.082891 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.083099 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53157d8c-7fed-48de-9714-cab91461a8f7" containerName="nova-metadata-log" containerID="cri-o://bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093" gracePeriod=30 Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.083253 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="53157d8c-7fed-48de-9714-cab91461a8f7" containerName="nova-metadata-metadata" 
containerID="cri-o://37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f" gracePeriod=30 Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.258957 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.259010 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.260134 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.798317 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.814671 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-nova-metadata-tls-certs\") pod \"53157d8c-7fed-48de-9714-cab91461a8f7\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.814754 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53157d8c-7fed-48de-9714-cab91461a8f7-logs\") pod \"53157d8c-7fed-48de-9714-cab91461a8f7\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.814860 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs8gh\" (UniqueName: \"kubernetes.io/projected/53157d8c-7fed-48de-9714-cab91461a8f7-kube-api-access-gs8gh\") pod \"53157d8c-7fed-48de-9714-cab91461a8f7\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.814975 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-combined-ca-bundle\") pod \"53157d8c-7fed-48de-9714-cab91461a8f7\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.815056 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-config-data\") pod \"53157d8c-7fed-48de-9714-cab91461a8f7\" (UID: \"53157d8c-7fed-48de-9714-cab91461a8f7\") " Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.815230 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53157d8c-7fed-48de-9714-cab91461a8f7-logs" (OuterVolumeSpecName: "logs") pod "53157d8c-7fed-48de-9714-cab91461a8f7" (UID: "53157d8c-7fed-48de-9714-cab91461a8f7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.815867 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/53157d8c-7fed-48de-9714-cab91461a8f7-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.822821 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53157d8c-7fed-48de-9714-cab91461a8f7-kube-api-access-gs8gh" (OuterVolumeSpecName: "kube-api-access-gs8gh") pod "53157d8c-7fed-48de-9714-cab91461a8f7" (UID: "53157d8c-7fed-48de-9714-cab91461a8f7"). InnerVolumeSpecName "kube-api-access-gs8gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.827404 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33807b55-8fc9-44dc-9639-b633bd748101" path="/var/lib/kubelet/pods/33807b55-8fc9-44dc-9639-b633bd748101/volumes" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.867817 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-config-data" (OuterVolumeSpecName: "config-data") pod "53157d8c-7fed-48de-9714-cab91461a8f7" (UID: "53157d8c-7fed-48de-9714-cab91461a8f7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.867862 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "53157d8c-7fed-48de-9714-cab91461a8f7" (UID: "53157d8c-7fed-48de-9714-cab91461a8f7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.901737 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "53157d8c-7fed-48de-9714-cab91461a8f7" (UID: "53157d8c-7fed-48de-9714-cab91461a8f7"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.918490 4974 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.918528 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs8gh\" (UniqueName: \"kubernetes.io/projected/53157d8c-7fed-48de-9714-cab91461a8f7-kube-api-access-gs8gh\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.918538 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.918549 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53157d8c-7fed-48de-9714-cab91461a8f7-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.944239 4974 generic.go:334] "Generic (PLEG): container finished" podID="53157d8c-7fed-48de-9714-cab91461a8f7" containerID="37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f" exitCode=0 Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.944486 4974 generic.go:334] "Generic (PLEG): container finished" podID="53157d8c-7fed-48de-9714-cab91461a8f7" containerID="bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093" exitCode=143 Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.944343 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.946291 4974 generic.go:334] "Generic (PLEG): container finished" podID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerID="97e1cdd995828883ad1d9be74d671d7f5f85484183dbe27b3b62eeff9c21ada4" exitCode=143 Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.947908 4974 generic.go:334] "Generic (PLEG): container finished" podID="e3edde18-d8b7-4d74-a226-2078fb905c13" containerID="c54198f199657ad749421ec9dd41c988fc735d4998cf1446050d644c4397ef0e" exitCode=0 Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.948068 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="502aab84-7155-4fcd-aecc-ada0de65f5d2" containerName="nova-scheduler-scheduler" containerID="cri-o://96cd22203fba9524159bc61ad032f605190b0b5f60e6571e95d8d40617d712f5" gracePeriod=30 Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.949633 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53157d8c-7fed-48de-9714-cab91461a8f7","Type":"ContainerDied","Data":"37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f"} Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.949676 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53157d8c-7fed-48de-9714-cab91461a8f7","Type":"ContainerDied","Data":"bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093"} Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.949687 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"53157d8c-7fed-48de-9714-cab91461a8f7","Type":"ContainerDied","Data":"451f44470147663ddf6904202559d513a74532b4036e575ad100247046cfeec4"} Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.949694 4974 scope.go:117] "RemoveContainer" 
containerID="37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.949696 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76d80de1-0221-4438-9300-b1ac5c3b286b","Type":"ContainerDied","Data":"97e1cdd995828883ad1d9be74d671d7f5f85484183dbe27b3b62eeff9c21ada4"} Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.949784 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w4zh2" event={"ID":"e3edde18-d8b7-4d74-a226-2078fb905c13","Type":"ContainerDied","Data":"c54198f199657ad749421ec9dd41c988fc735d4998cf1446050d644c4397ef0e"} Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.971966 4974 scope.go:117] "RemoveContainer" containerID="bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.993643 4974 scope.go:117] "RemoveContainer" containerID="37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f" Oct 13 18:33:53 crc kubenswrapper[4974]: E1013 18:33:53.994287 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f\": container with ID starting with 37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f not found: ID does not exist" containerID="37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.994328 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f"} err="failed to get container status \"37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f\": rpc error: code = NotFound desc = could not find container \"37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f\": container with ID 
starting with 37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f not found: ID does not exist" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.994355 4974 scope.go:117] "RemoveContainer" containerID="bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.994478 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:53 crc kubenswrapper[4974]: E1013 18:33:53.994704 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093\": container with ID starting with bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093 not found: ID does not exist" containerID="bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.994737 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093"} err="failed to get container status \"bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093\": rpc error: code = NotFound desc = could not find container \"bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093\": container with ID starting with bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093 not found: ID does not exist" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.994761 4974 scope.go:117] "RemoveContainer" containerID="37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.995138 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f"} err="failed to get container status 
\"37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f\": rpc error: code = NotFound desc = could not find container \"37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f\": container with ID starting with 37f3998f4a58051a618c279e84e890c94695630f75d5e3a0d1ad602be6bede3f not found: ID does not exist" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.995164 4974 scope.go:117] "RemoveContainer" containerID="bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093" Oct 13 18:33:53 crc kubenswrapper[4974]: I1013 18:33:53.996314 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093"} err="failed to get container status \"bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093\": rpc error: code = NotFound desc = could not find container \"bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093\": container with ID starting with bb41946cade053bcaf576ee5fe76d0a4f16d7c0809d6698eccc1346b2fac6093 not found: ID does not exist" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.007510 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.017764 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:54 crc kubenswrapper[4974]: E1013 18:33:54.018201 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33807b55-8fc9-44dc-9639-b633bd748101" containerName="init" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.018215 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="33807b55-8fc9-44dc-9639-b633bd748101" containerName="init" Oct 13 18:33:54 crc kubenswrapper[4974]: E1013 18:33:54.018239 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53157d8c-7fed-48de-9714-cab91461a8f7" containerName="nova-metadata-log" Oct 13 18:33:54 crc 
kubenswrapper[4974]: I1013 18:33:54.018245 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="53157d8c-7fed-48de-9714-cab91461a8f7" containerName="nova-metadata-log" Oct 13 18:33:54 crc kubenswrapper[4974]: E1013 18:33:54.018256 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33807b55-8fc9-44dc-9639-b633bd748101" containerName="dnsmasq-dns" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.018261 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="33807b55-8fc9-44dc-9639-b633bd748101" containerName="dnsmasq-dns" Oct 13 18:33:54 crc kubenswrapper[4974]: E1013 18:33:54.018280 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f00ec4d-7131-4fbd-9077-53b53ea0abc1" containerName="nova-manage" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.018285 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f00ec4d-7131-4fbd-9077-53b53ea0abc1" containerName="nova-manage" Oct 13 18:33:54 crc kubenswrapper[4974]: E1013 18:33:54.018301 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53157d8c-7fed-48de-9714-cab91461a8f7" containerName="nova-metadata-metadata" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.018307 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="53157d8c-7fed-48de-9714-cab91461a8f7" containerName="nova-metadata-metadata" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.018481 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="33807b55-8fc9-44dc-9639-b633bd748101" containerName="dnsmasq-dns" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.018493 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="53157d8c-7fed-48de-9714-cab91461a8f7" containerName="nova-metadata-log" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.018510 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f00ec4d-7131-4fbd-9077-53b53ea0abc1" containerName="nova-manage" Oct 13 18:33:54 crc 
kubenswrapper[4974]: I1013 18:33:54.018524 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="53157d8c-7fed-48de-9714-cab91461a8f7" containerName="nova-metadata-metadata" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.031335 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.031435 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.035973 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.036167 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.122512 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msxsn\" (UniqueName: \"kubernetes.io/projected/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-kube-api-access-msxsn\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.122642 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-config-data\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.122725 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-logs\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 
18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.122933 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.123131 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.224710 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.224824 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.224860 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msxsn\" (UniqueName: \"kubernetes.io/projected/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-kube-api-access-msxsn\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.224913 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-config-data\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.224937 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-logs\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.225302 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-logs\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.230455 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.231374 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.231640 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-config-data\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " 
pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.246675 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msxsn\" (UniqueName: \"kubernetes.io/projected/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-kube-api-access-msxsn\") pod \"nova-metadata-0\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.359644 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.855074 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:33:54 crc kubenswrapper[4974]: I1013 18:33:54.959286 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f","Type":"ContainerStarted","Data":"9d1e31ecacae10e82a1049317a4503ba8949a1bed055bb6f9ebf349c1d2a6cc4"} Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.270586 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.351631 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-combined-ca-bundle\") pod \"e3edde18-d8b7-4d74-a226-2078fb905c13\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.351701 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z8nx\" (UniqueName: \"kubernetes.io/projected/e3edde18-d8b7-4d74-a226-2078fb905c13-kube-api-access-8z8nx\") pod \"e3edde18-d8b7-4d74-a226-2078fb905c13\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.351752 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-scripts\") pod \"e3edde18-d8b7-4d74-a226-2078fb905c13\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.351870 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-config-data\") pod \"e3edde18-d8b7-4d74-a226-2078fb905c13\" (UID: \"e3edde18-d8b7-4d74-a226-2078fb905c13\") " Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.356220 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3edde18-d8b7-4d74-a226-2078fb905c13-kube-api-access-8z8nx" (OuterVolumeSpecName: "kube-api-access-8z8nx") pod "e3edde18-d8b7-4d74-a226-2078fb905c13" (UID: "e3edde18-d8b7-4d74-a226-2078fb905c13"). InnerVolumeSpecName "kube-api-access-8z8nx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.356685 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-scripts" (OuterVolumeSpecName: "scripts") pod "e3edde18-d8b7-4d74-a226-2078fb905c13" (UID: "e3edde18-d8b7-4d74-a226-2078fb905c13"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.382261 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3edde18-d8b7-4d74-a226-2078fb905c13" (UID: "e3edde18-d8b7-4d74-a226-2078fb905c13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.390908 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-config-data" (OuterVolumeSpecName: "config-data") pod "e3edde18-d8b7-4d74-a226-2078fb905c13" (UID: "e3edde18-d8b7-4d74-a226-2078fb905c13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.453865 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.453901 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z8nx\" (UniqueName: \"kubernetes.io/projected/e3edde18-d8b7-4d74-a226-2078fb905c13-kube-api-access-8z8nx\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.453914 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.453923 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3edde18-d8b7-4d74-a226-2078fb905c13-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.821411 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53157d8c-7fed-48de-9714-cab91461a8f7" path="/var/lib/kubelet/pods/53157d8c-7fed-48de-9714-cab91461a8f7/volumes" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.972867 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f","Type":"ContainerStarted","Data":"316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee"} Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.974073 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f","Type":"ContainerStarted","Data":"b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3"} Oct 13 18:33:55 crc 
kubenswrapper[4974]: I1013 18:33:55.974783 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-w4zh2" event={"ID":"e3edde18-d8b7-4d74-a226-2078fb905c13","Type":"ContainerDied","Data":"ab7aa7c3cd510cda40a893bee144075f1288af0694b1399cf09bd7380565795b"} Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.975020 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab7aa7c3cd510cda40a893bee144075f1288af0694b1399cf09bd7380565795b" Oct 13 18:33:55 crc kubenswrapper[4974]: I1013 18:33:55.974886 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-w4zh2" Oct 13 18:33:55 crc kubenswrapper[4974]: E1013 18:33:55.978004 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="96cd22203fba9524159bc61ad032f605190b0b5f60e6571e95d8d40617d712f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 18:33:55 crc kubenswrapper[4974]: E1013 18:33:55.980248 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="96cd22203fba9524159bc61ad032f605190b0b5f60e6571e95d8d40617d712f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 18:33:55 crc kubenswrapper[4974]: E1013 18:33:55.983117 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="96cd22203fba9524159bc61ad032f605190b0b5f60e6571e95d8d40617d712f5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 18:33:55 crc kubenswrapper[4974]: E1013 18:33:55.983193 4974 prober.go:104] "Probe errored" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="502aab84-7155-4fcd-aecc-ada0de65f5d2" containerName="nova-scheduler-scheduler" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.006549 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.006524855 podStartE2EDuration="3.006524855s" podCreationTimestamp="2025-10-13 18:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:33:55.995481178 +0000 UTC m=+1170.899847318" watchObservedRunningTime="2025-10-13 18:33:56.006524855 +0000 UTC m=+1170.910890955" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.037478 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 18:33:56 crc kubenswrapper[4974]: E1013 18:33:56.038411 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3edde18-d8b7-4d74-a226-2078fb905c13" containerName="nova-cell1-conductor-db-sync" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.038431 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3edde18-d8b7-4d74-a226-2078fb905c13" containerName="nova-cell1-conductor-db-sync" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.038615 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3edde18-d8b7-4d74-a226-2078fb905c13" containerName="nova-cell1-conductor-db-sync" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.039288 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.059980 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.079105 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.168720 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004c8db1-b15c-43d1-b988-92d779aaebb2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"004c8db1-b15c-43d1-b988-92d779aaebb2\") " pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.168809 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004c8db1-b15c-43d1-b988-92d779aaebb2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"004c8db1-b15c-43d1-b988-92d779aaebb2\") " pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.168987 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5clmr\" (UniqueName: \"kubernetes.io/projected/004c8db1-b15c-43d1-b988-92d779aaebb2-kube-api-access-5clmr\") pod \"nova-cell1-conductor-0\" (UID: \"004c8db1-b15c-43d1-b988-92d779aaebb2\") " pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.271610 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004c8db1-b15c-43d1-b988-92d779aaebb2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"004c8db1-b15c-43d1-b988-92d779aaebb2\") " pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc 
kubenswrapper[4974]: I1013 18:33:56.271901 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004c8db1-b15c-43d1-b988-92d779aaebb2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"004c8db1-b15c-43d1-b988-92d779aaebb2\") " pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.272154 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5clmr\" (UniqueName: \"kubernetes.io/projected/004c8db1-b15c-43d1-b988-92d779aaebb2-kube-api-access-5clmr\") pod \"nova-cell1-conductor-0\" (UID: \"004c8db1-b15c-43d1-b988-92d779aaebb2\") " pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.282776 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/004c8db1-b15c-43d1-b988-92d779aaebb2-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"004c8db1-b15c-43d1-b988-92d779aaebb2\") " pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.283071 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/004c8db1-b15c-43d1-b988-92d779aaebb2-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"004c8db1-b15c-43d1-b988-92d779aaebb2\") " pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.295206 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5clmr\" (UniqueName: \"kubernetes.io/projected/004c8db1-b15c-43d1-b988-92d779aaebb2-kube-api-access-5clmr\") pod \"nova-cell1-conductor-0\" (UID: \"004c8db1-b15c-43d1-b988-92d779aaebb2\") " pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.382000 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:56 crc kubenswrapper[4974]: W1013 18:33:56.855785 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod004c8db1_b15c_43d1_b988_92d779aaebb2.slice/crio-7b3d9cf17664754bca8a7331199261d0783d08f875dfc76545d35ce3938f1c04 WatchSource:0}: Error finding container 7b3d9cf17664754bca8a7331199261d0783d08f875dfc76545d35ce3938f1c04: Status 404 returned error can't find the container with id 7b3d9cf17664754bca8a7331199261d0783d08f875dfc76545d35ce3938f1c04 Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.855865 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.989668 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"004c8db1-b15c-43d1-b988-92d779aaebb2","Type":"ContainerStarted","Data":"7b3d9cf17664754bca8a7331199261d0783d08f875dfc76545d35ce3938f1c04"} Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.993538 4974 generic.go:334] "Generic (PLEG): container finished" podID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerID="9f96a353f10adc6fd44eeb6c50ea1200eeba6435a1a14877d430fa7b5a95da06" exitCode=0 Oct 13 18:33:56 crc kubenswrapper[4974]: I1013 18:33:56.993563 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76d80de1-0221-4438-9300-b1ac5c3b286b","Type":"ContainerDied","Data":"9f96a353f10adc6fd44eeb6c50ea1200eeba6435a1a14877d430fa7b5a95da06"} Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.342176 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.390889 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76d80de1-0221-4438-9300-b1ac5c3b286b-logs\") pod \"76d80de1-0221-4438-9300-b1ac5c3b286b\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.390987 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-config-data\") pod \"76d80de1-0221-4438-9300-b1ac5c3b286b\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.391053 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndpmq\" (UniqueName: \"kubernetes.io/projected/76d80de1-0221-4438-9300-b1ac5c3b286b-kube-api-access-ndpmq\") pod \"76d80de1-0221-4438-9300-b1ac5c3b286b\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.391087 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-combined-ca-bundle\") pod \"76d80de1-0221-4438-9300-b1ac5c3b286b\" (UID: \"76d80de1-0221-4438-9300-b1ac5c3b286b\") " Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.392018 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76d80de1-0221-4438-9300-b1ac5c3b286b-logs" (OuterVolumeSpecName: "logs") pod "76d80de1-0221-4438-9300-b1ac5c3b286b" (UID: "76d80de1-0221-4438-9300-b1ac5c3b286b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.395969 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76d80de1-0221-4438-9300-b1ac5c3b286b-kube-api-access-ndpmq" (OuterVolumeSpecName: "kube-api-access-ndpmq") pod "76d80de1-0221-4438-9300-b1ac5c3b286b" (UID: "76d80de1-0221-4438-9300-b1ac5c3b286b"). InnerVolumeSpecName "kube-api-access-ndpmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.420887 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "76d80de1-0221-4438-9300-b1ac5c3b286b" (UID: "76d80de1-0221-4438-9300-b1ac5c3b286b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.421335 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-config-data" (OuterVolumeSpecName: "config-data") pod "76d80de1-0221-4438-9300-b1ac5c3b286b" (UID: "76d80de1-0221-4438-9300-b1ac5c3b286b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.493542 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/76d80de1-0221-4438-9300-b1ac5c3b286b-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.493577 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.493589 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndpmq\" (UniqueName: \"kubernetes.io/projected/76d80de1-0221-4438-9300-b1ac5c3b286b-kube-api-access-ndpmq\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:57 crc kubenswrapper[4974]: I1013 18:33:57.493598 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/76d80de1-0221-4438-9300-b1ac5c3b286b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.005719 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"004c8db1-b15c-43d1-b988-92d779aaebb2","Type":"ContainerStarted","Data":"86d84207ce525bce023da27317a086750b16f1fbbb54896e6b6f053e60a5d923"} Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.006262 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.007880 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.007983 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"76d80de1-0221-4438-9300-b1ac5c3b286b","Type":"ContainerDied","Data":"57436b4028c9ff641e196b39a19110dfd11492a207f847cb77d8c86e9469e3c2"} Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.008036 4974 scope.go:117] "RemoveContainer" containerID="9f96a353f10adc6fd44eeb6c50ea1200eeba6435a1a14877d430fa7b5a95da06" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.027622 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.027598838 podStartE2EDuration="2.027598838s" podCreationTimestamp="2025-10-13 18:33:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:33:58.022991589 +0000 UTC m=+1172.927357689" watchObservedRunningTime="2025-10-13 18:33:58.027598838 +0000 UTC m=+1172.931964948" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.045280 4974 scope.go:117] "RemoveContainer" containerID="97e1cdd995828883ad1d9be74d671d7f5f85484183dbe27b3b62eeff9c21ada4" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.074724 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.087622 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.106735 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 18:33:58 crc kubenswrapper[4974]: E1013 18:33:58.107276 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerName="nova-api-log" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.107295 4974 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerName="nova-api-log" Oct 13 18:33:58 crc kubenswrapper[4974]: E1013 18:33:58.107319 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerName="nova-api-api" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.107327 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerName="nova-api-api" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.107562 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerName="nova-api-log" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.107595 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" containerName="nova-api-api" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.108985 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.114335 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.118256 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.214946 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-logs\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.215059 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.215123 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-config-data\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.215166 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs5pn\" (UniqueName: \"kubernetes.io/projected/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-kube-api-access-gs5pn\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.317174 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-logs\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.318516 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.318711 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-config-data\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.319505 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-gs5pn\" (UniqueName: \"kubernetes.io/projected/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-kube-api-access-gs5pn\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.318392 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-logs\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.322306 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.329946 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-config-data\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.336452 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs5pn\" (UniqueName: \"kubernetes.io/projected/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-kube-api-access-gs5pn\") pod \"nova-api-0\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") " pod="openstack/nova-api-0" Oct 13 18:33:58 crc kubenswrapper[4974]: I1013 18:33:58.436317 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:58.928793 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.019810 4974 generic.go:334] "Generic (PLEG): container finished" podID="502aab84-7155-4fcd-aecc-ada0de65f5d2" containerID="96cd22203fba9524159bc61ad032f605190b0b5f60e6571e95d8d40617d712f5" exitCode=0 Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.019868 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"502aab84-7155-4fcd-aecc-ada0de65f5d2","Type":"ContainerDied","Data":"96cd22203fba9524159bc61ad032f605190b0b5f60e6571e95d8d40617d712f5"} Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.019891 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"502aab84-7155-4fcd-aecc-ada0de65f5d2","Type":"ContainerDied","Data":"e462cea78cbf4d77dff7a66d6cf38597175583b235ec775f1f4b87a2d2eb234c"} Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.019902 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e462cea78cbf4d77dff7a66d6cf38597175583b235ec775f1f4b87a2d2eb234c" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.022730 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2","Type":"ContainerStarted","Data":"5af436d632b0ed3210d9cb46af895875d8b6a4331eec534942118bb4130b89dc"} Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.064927 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.134037 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-combined-ca-bundle\") pod \"502aab84-7155-4fcd-aecc-ada0de65f5d2\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.134098 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqwbz\" (UniqueName: \"kubernetes.io/projected/502aab84-7155-4fcd-aecc-ada0de65f5d2-kube-api-access-bqwbz\") pod \"502aab84-7155-4fcd-aecc-ada0de65f5d2\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.134317 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-config-data\") pod \"502aab84-7155-4fcd-aecc-ada0de65f5d2\" (UID: \"502aab84-7155-4fcd-aecc-ada0de65f5d2\") " Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.140892 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/502aab84-7155-4fcd-aecc-ada0de65f5d2-kube-api-access-bqwbz" (OuterVolumeSpecName: "kube-api-access-bqwbz") pod "502aab84-7155-4fcd-aecc-ada0de65f5d2" (UID: "502aab84-7155-4fcd-aecc-ada0de65f5d2"). InnerVolumeSpecName "kube-api-access-bqwbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.165885 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "502aab84-7155-4fcd-aecc-ada0de65f5d2" (UID: "502aab84-7155-4fcd-aecc-ada0de65f5d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.178756 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-config-data" (OuterVolumeSpecName: "config-data") pod "502aab84-7155-4fcd-aecc-ada0de65f5d2" (UID: "502aab84-7155-4fcd-aecc-ada0de65f5d2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.236506 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.236537 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/502aab84-7155-4fcd-aecc-ada0de65f5d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.236549 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqwbz\" (UniqueName: \"kubernetes.io/projected/502aab84-7155-4fcd-aecc-ada0de65f5d2-kube-api-access-bqwbz\") on node \"crc\" DevicePath \"\"" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.367740 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.367984 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 18:33:59 crc kubenswrapper[4974]: I1013 18:33:59.824103 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76d80de1-0221-4438-9300-b1ac5c3b286b" path="/var/lib/kubelet/pods/76d80de1-0221-4438-9300-b1ac5c3b286b/volumes" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.034477 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2","Type":"ContainerStarted","Data":"f54ae4dbb8a52d2adfd595700cd49cb809ad589947325d3b2d8bb53908dfb377"} Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.034498 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.034521 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2","Type":"ContainerStarted","Data":"45d13f3258b9ae888d48e66a8c2cdb9c6913d0bd762bee40b7c703f05413edc6"} Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.075678 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.075646849 podStartE2EDuration="2.075646849s" podCreationTimestamp="2025-10-13 18:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:34:00.069428296 +0000 UTC m=+1174.973794386" watchObservedRunningTime="2025-10-13 18:34:00.075646849 +0000 UTC m=+1174.980012929" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.104488 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.120577 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.132293 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:00 crc kubenswrapper[4974]: E1013 18:34:00.133036 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502aab84-7155-4fcd-aecc-ada0de65f5d2" containerName="nova-scheduler-scheduler" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.133062 4974 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="502aab84-7155-4fcd-aecc-ada0de65f5d2" containerName="nova-scheduler-scheduler" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.133534 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="502aab84-7155-4fcd-aecc-ada0de65f5d2" containerName="nova-scheduler-scheduler" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.134497 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.142623 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.173982 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.194992 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-config-data\") pod \"nova-scheduler-0\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.195230 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qd9v\" (UniqueName: \"kubernetes.io/projected/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-kube-api-access-8qd9v\") pod \"nova-scheduler-0\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.195327 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: 
I1013 18:34:00.297598 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-config-data\") pod \"nova-scheduler-0\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.297938 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qd9v\" (UniqueName: \"kubernetes.io/projected/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-kube-api-access-8qd9v\") pod \"nova-scheduler-0\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.298196 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.303933 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.304021 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-config-data\") pod \"nova-scheduler-0\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.313030 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qd9v\" (UniqueName: 
\"kubernetes.io/projected/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-kube-api-access-8qd9v\") pod \"nova-scheduler-0\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.494765 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 18:34:00 crc kubenswrapper[4974]: I1013 18:34:00.981262 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:00 crc kubenswrapper[4974]: W1013 18:34:00.984471 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbb2ba38_77ca_4954_998c_ca99c34e0b3c.slice/crio-339487af903fd03e84072d9ebf8e1417d4ee4caee501a555b9ff3221897a07c6 WatchSource:0}: Error finding container 339487af903fd03e84072d9ebf8e1417d4ee4caee501a555b9ff3221897a07c6: Status 404 returned error can't find the container with id 339487af903fd03e84072d9ebf8e1417d4ee4caee501a555b9ff3221897a07c6 Oct 13 18:34:01 crc kubenswrapper[4974]: I1013 18:34:01.048684 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbb2ba38-77ca-4954-998c-ca99c34e0b3c","Type":"ContainerStarted","Data":"339487af903fd03e84072d9ebf8e1417d4ee4caee501a555b9ff3221897a07c6"} Oct 13 18:34:01 crc kubenswrapper[4974]: I1013 18:34:01.843280 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="502aab84-7155-4fcd-aecc-ada0de65f5d2" path="/var/lib/kubelet/pods/502aab84-7155-4fcd-aecc-ada0de65f5d2/volumes" Oct 13 18:34:02 crc kubenswrapper[4974]: I1013 18:34:02.063444 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbb2ba38-77ca-4954-998c-ca99c34e0b3c","Type":"ContainerStarted","Data":"3e2285772d7905b9bda0126dc94d24d559183db00d7b16cfb8c16c8db10d82a4"} Oct 13 18:34:02 crc kubenswrapper[4974]: I1013 18:34:02.095624 4974 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.09560119 podStartE2EDuration="2.09560119s" podCreationTimestamp="2025-10-13 18:34:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:34:02.082075834 +0000 UTC m=+1176.986441954" watchObservedRunningTime="2025-10-13 18:34:02.09560119 +0000 UTC m=+1176.999967290" Oct 13 18:34:04 crc kubenswrapper[4974]: I1013 18:34:04.367118 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 18:34:04 crc kubenswrapper[4974]: I1013 18:34:04.367649 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 18:34:05 crc kubenswrapper[4974]: I1013 18:34:05.387895 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 18:34:05 crc kubenswrapper[4974]: I1013 18:34:05.387897 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 18:34:05 crc kubenswrapper[4974]: I1013 18:34:05.495391 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 13 18:34:06 crc kubenswrapper[4974]: I1013 18:34:06.417032 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Oct 13 18:34:08 crc kubenswrapper[4974]: I1013 18:34:08.438382 4974 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 18:34:08 crc kubenswrapper[4974]: I1013 18:34:08.438804 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 18:34:09 crc kubenswrapper[4974]: I1013 18:34:09.521941 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.223:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 18:34:09 crc kubenswrapper[4974]: I1013 18:34:09.523626 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.223:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Oct 13 18:34:10 crc kubenswrapper[4974]: I1013 18:34:10.495753 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 13 18:34:10 crc kubenswrapper[4974]: I1013 18:34:10.545084 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 13 18:34:11 crc kubenswrapper[4974]: I1013 18:34:11.216618 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 13 18:34:14 crc kubenswrapper[4974]: I1013 18:34:14.377604 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 18:34:14 crc kubenswrapper[4974]: I1013 18:34:14.379267 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 18:34:14 crc kubenswrapper[4974]: I1013 18:34:14.401864 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Oct 13 18:34:15 crc kubenswrapper[4974]: I1013 18:34:15.254300 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 18:34:16 crc kubenswrapper[4974]: I1013 18:34:16.541160 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.257392 4974 generic.go:334] "Generic (PLEG): container finished" podID="34bce935-6ffd-4772-bc4f-78eed1372e60" containerID="1241cb87cad276d93066d5ab9659eb3ac6b16d2c9d8fc4680822ca212cacbc47" exitCode=137 Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.257492 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"34bce935-6ffd-4772-bc4f-78eed1372e60","Type":"ContainerDied","Data":"1241cb87cad276d93066d5ab9659eb3ac6b16d2c9d8fc4680822ca212cacbc47"} Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.257704 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"34bce935-6ffd-4772-bc4f-78eed1372e60","Type":"ContainerDied","Data":"009f9997928a234e5cb3f1ad2bac6ac4810a3f2ffcb29eb6ae31d19f4eea410e"} Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.257722 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="009f9997928a234e5cb3f1ad2bac6ac4810a3f2ffcb29eb6ae31d19f4eea410e" Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.345303 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.392750 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-config-data\") pod \"34bce935-6ffd-4772-bc4f-78eed1372e60\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.392844 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fr2m\" (UniqueName: \"kubernetes.io/projected/34bce935-6ffd-4772-bc4f-78eed1372e60-kube-api-access-4fr2m\") pod \"34bce935-6ffd-4772-bc4f-78eed1372e60\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.393121 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-combined-ca-bundle\") pod \"34bce935-6ffd-4772-bc4f-78eed1372e60\" (UID: \"34bce935-6ffd-4772-bc4f-78eed1372e60\") " Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.404076 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34bce935-6ffd-4772-bc4f-78eed1372e60-kube-api-access-4fr2m" (OuterVolumeSpecName: "kube-api-access-4fr2m") pod "34bce935-6ffd-4772-bc4f-78eed1372e60" (UID: "34bce935-6ffd-4772-bc4f-78eed1372e60"). InnerVolumeSpecName "kube-api-access-4fr2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.429890 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34bce935-6ffd-4772-bc4f-78eed1372e60" (UID: "34bce935-6ffd-4772-bc4f-78eed1372e60"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.437564 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-config-data" (OuterVolumeSpecName: "config-data") pod "34bce935-6ffd-4772-bc4f-78eed1372e60" (UID: "34bce935-6ffd-4772-bc4f-78eed1372e60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.496163 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.496423 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fr2m\" (UniqueName: \"kubernetes.io/projected/34bce935-6ffd-4772-bc4f-78eed1372e60-kube-api-access-4fr2m\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:17 crc kubenswrapper[4974]: I1013 18:34:17.496536 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34bce935-6ffd-4772-bc4f-78eed1372e60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.268515 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.295716 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.303946 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.324331 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 18:34:18 crc kubenswrapper[4974]: E1013 18:34:18.325022 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34bce935-6ffd-4772-bc4f-78eed1372e60" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.325114 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="34bce935-6ffd-4772-bc4f-78eed1372e60" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.325428 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="34bce935-6ffd-4772-bc4f-78eed1372e60" containerName="nova-cell1-novncproxy-novncproxy" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.326335 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.328684 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.328969 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.332597 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.340584 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.412236 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.412327 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.412371 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjjtx\" (UniqueName: \"kubernetes.io/projected/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-kube-api-access-kjjtx\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 
13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.412399 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.412512 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.444114 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.444862 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.451041 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.452111 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.515291 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.515406 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.515448 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjjtx\" (UniqueName: \"kubernetes.io/projected/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-kube-api-access-kjjtx\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.515473 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.515524 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.520312 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.520529 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.523036 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.524234 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.540264 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjjtx\" (UniqueName: \"kubernetes.io/projected/86eb0f8b-123d-4c2a-b7c6-d0a613625ee8-kube-api-access-kjjtx\") pod \"nova-cell1-novncproxy-0\" (UID: \"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8\") " pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:18 crc kubenswrapper[4974]: I1013 18:34:18.653809 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.168945 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Oct 13 18:34:19 crc kubenswrapper[4974]: W1013 18:34:19.175861 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86eb0f8b_123d_4c2a_b7c6_d0a613625ee8.slice/crio-c5dd4549b3b298cd8f9b1a1a72c128b6d6aac5f245f5eae3420cbb493411a1a7 WatchSource:0}: Error finding container c5dd4549b3b298cd8f9b1a1a72c128b6d6aac5f245f5eae3420cbb493411a1a7: Status 404 returned error can't find the container with id c5dd4549b3b298cd8f9b1a1a72c128b6d6aac5f245f5eae3420cbb493411a1a7 Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.281979 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8","Type":"ContainerStarted","Data":"c5dd4549b3b298cd8f9b1a1a72c128b6d6aac5f245f5eae3420cbb493411a1a7"} Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.282298 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.287613 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.490462 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d8dc5545f-zxwtd"] Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.499427 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.518618 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d8dc5545f-zxwtd"]
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.541660 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.541719 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-svc\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.541762 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-295r4\" (UniqueName: \"kubernetes.io/projected/c00496a5-8aaa-48e0-a156-31a35d2299bf-kube-api-access-295r4\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.541792 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-swift-storage-0\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.541848 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.541878 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-config\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.646290 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-295r4\" (UniqueName: \"kubernetes.io/projected/c00496a5-8aaa-48e0-a156-31a35d2299bf-kube-api-access-295r4\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.646365 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-swift-storage-0\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.646461 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.646508 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-config\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.646590 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.646643 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-svc\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.647712 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-svc\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.649007 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-sb\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.649228 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-nb\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.649421 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-swift-storage-0\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.653392 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-config\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.670823 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-295r4\" (UniqueName: \"kubernetes.io/projected/c00496a5-8aaa-48e0-a156-31a35d2299bf-kube-api-access-295r4\") pod \"dnsmasq-dns-5d8dc5545f-zxwtd\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.827025 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34bce935-6ffd-4772-bc4f-78eed1372e60" path="/var/lib/kubelet/pods/34bce935-6ffd-4772-bc4f-78eed1372e60/volumes"
Oct 13 18:34:19 crc kubenswrapper[4974]: I1013 18:34:19.879925 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:20 crc kubenswrapper[4974]: I1013 18:34:20.292900 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"86eb0f8b-123d-4c2a-b7c6-d0a613625ee8","Type":"ContainerStarted","Data":"7da5c8aad70e3910c32c229d54d1115a8e183479739b0e0c2457ce992f8e85cd"}
Oct 13 18:34:20 crc kubenswrapper[4974]: I1013 18:34:20.352193 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.352162583 podStartE2EDuration="2.352162583s" podCreationTimestamp="2025-10-13 18:34:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:34:20.315112973 +0000 UTC m=+1195.219479053" watchObservedRunningTime="2025-10-13 18:34:20.352162583 +0000 UTC m=+1195.256528673"
Oct 13 18:34:20 crc kubenswrapper[4974]: W1013 18:34:20.364198 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc00496a5_8aaa_48e0_a156_31a35d2299bf.slice/crio-55dbf4046374e846074c0a5a1d86b4e73c4468cd160b34f8f9ced60e1281900e WatchSource:0}: Error finding container 55dbf4046374e846074c0a5a1d86b4e73c4468cd160b34f8f9ced60e1281900e: Status 404 returned error can't find the container with id 55dbf4046374e846074c0a5a1d86b4e73c4468cd160b34f8f9ced60e1281900e
Oct 13 18:34:20 crc kubenswrapper[4974]: I1013 18:34:20.366347 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d8dc5545f-zxwtd"]
Oct 13 18:34:21 crc kubenswrapper[4974]: I1013 18:34:21.303556 4974 generic.go:334] "Generic (PLEG): container finished" podID="c00496a5-8aaa-48e0-a156-31a35d2299bf" containerID="55163cfaa7e18ac43af395568c0b420dbc9d6ffbbb2bfd7be356befc68c405c9" exitCode=0
Oct 13 18:34:21 crc kubenswrapper[4974]: I1013 18:34:21.305686 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd" event={"ID":"c00496a5-8aaa-48e0-a156-31a35d2299bf","Type":"ContainerDied","Data":"55163cfaa7e18ac43af395568c0b420dbc9d6ffbbb2bfd7be356befc68c405c9"}
Oct 13 18:34:21 crc kubenswrapper[4974]: I1013 18:34:21.305723 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd" event={"ID":"c00496a5-8aaa-48e0-a156-31a35d2299bf","Type":"ContainerStarted","Data":"55dbf4046374e846074c0a5a1d86b4e73c4468cd160b34f8f9ced60e1281900e"}
Oct 13 18:34:21 crc kubenswrapper[4974]: I1013 18:34:21.947238 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 13 18:34:22 crc kubenswrapper[4974]: I1013 18:34:22.314100 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerName="nova-api-log" containerID="cri-o://45d13f3258b9ae888d48e66a8c2cdb9c6913d0bd762bee40b7c703f05413edc6" gracePeriod=30
Oct 13 18:34:22 crc kubenswrapper[4974]: I1013 18:34:22.315185 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd" event={"ID":"c00496a5-8aaa-48e0-a156-31a35d2299bf","Type":"ContainerStarted","Data":"ef537a173a2dc3838e49a4b1626f35165d15576ca6e9f530b590d4e5f8709072"}
Oct 13 18:34:22 crc kubenswrapper[4974]: I1013 18:34:22.315223 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:22 crc kubenswrapper[4974]: I1013 18:34:22.315529 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerName="nova-api-api" containerID="cri-o://f54ae4dbb8a52d2adfd595700cd49cb809ad589947325d3b2d8bb53908dfb377" gracePeriod=30
Oct 13 18:34:22 crc kubenswrapper[4974]: I1013 18:34:22.353042 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd" podStartSLOduration=3.353026524 podStartE2EDuration="3.353026524s" podCreationTimestamp="2025-10-13 18:34:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:34:22.351143981 +0000 UTC m=+1197.255510061" watchObservedRunningTime="2025-10-13 18:34:22.353026524 +0000 UTC m=+1197.257392604"
Oct 13 18:34:23 crc kubenswrapper[4974]: I1013 18:34:23.190437 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Oct 13 18:34:23 crc kubenswrapper[4974]: I1013 18:34:23.191814 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="ceilometer-central-agent" containerID="cri-o://c52aabfef332cc4dda8b2cfdd1c2e5a3f1f3030ff41a4fe490ffb8cd9fdc98f4" gracePeriod=30
Oct 13 18:34:23 crc kubenswrapper[4974]: I1013 18:34:23.192194 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="proxy-httpd" containerID="cri-o://29f125fc4dce659e6f293987d08ade7a2608d844f008f3a491681cb5103d0077" gracePeriod=30
Oct 13 18:34:23 crc kubenswrapper[4974]: I1013 18:34:23.192237 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="sg-core" containerID="cri-o://30c4533ea9b2d1d827cad3870c8a2364fcba5120f0d88a2158276d93e8e5e5d4" gracePeriod=30
Oct 13 18:34:23 crc kubenswrapper[4974]: I1013 18:34:23.192269 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="ceilometer-notification-agent" containerID="cri-o://aed60a5c13f637f89fa1e08e9857003f10646a88f4d3cd9ff45783c355390c62" gracePeriod=30
Oct 13 18:34:23 crc kubenswrapper[4974]: I1013 18:34:23.325122 4974 generic.go:334] "Generic (PLEG): container finished" podID="616aad84-5d9d-41d6-9342-67199ffade4a" containerID="30c4533ea9b2d1d827cad3870c8a2364fcba5120f0d88a2158276d93e8e5e5d4" exitCode=2
Oct 13 18:34:23 crc kubenswrapper[4974]: I1013 18:34:23.325265 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"616aad84-5d9d-41d6-9342-67199ffade4a","Type":"ContainerDied","Data":"30c4533ea9b2d1d827cad3870c8a2364fcba5120f0d88a2158276d93e8e5e5d4"}
Oct 13 18:34:23 crc kubenswrapper[4974]: I1013 18:34:23.329606 4974 generic.go:334] "Generic (PLEG): container finished" podID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerID="45d13f3258b9ae888d48e66a8c2cdb9c6913d0bd762bee40b7c703f05413edc6" exitCode=143
Oct 13 18:34:23 crc kubenswrapper[4974]: I1013 18:34:23.329680 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2","Type":"ContainerDied","Data":"45d13f3258b9ae888d48e66a8c2cdb9c6913d0bd762bee40b7c703f05413edc6"}
Oct 13 18:34:23 crc kubenswrapper[4974]: I1013 18:34:23.654553 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.348766 4974 generic.go:334] "Generic (PLEG): container finished" podID="616aad84-5d9d-41d6-9342-67199ffade4a" containerID="29f125fc4dce659e6f293987d08ade7a2608d844f008f3a491681cb5103d0077" exitCode=0
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.349080 4974 generic.go:334] "Generic (PLEG): container finished" podID="616aad84-5d9d-41d6-9342-67199ffade4a" containerID="c52aabfef332cc4dda8b2cfdd1c2e5a3f1f3030ff41a4fe490ffb8cd9fdc98f4" exitCode=0
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.349123 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"616aad84-5d9d-41d6-9342-67199ffade4a","Type":"ContainerDied","Data":"29f125fc4dce659e6f293987d08ade7a2608d844f008f3a491681cb5103d0077"}
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.349175 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"616aad84-5d9d-41d6-9342-67199ffade4a","Type":"ContainerDied","Data":"c52aabfef332cc4dda8b2cfdd1c2e5a3f1f3030ff41a4fe490ffb8cd9fdc98f4"}
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.366052 4974 generic.go:334] "Generic (PLEG): container finished" podID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerID="f54ae4dbb8a52d2adfd595700cd49cb809ad589947325d3b2d8bb53908dfb377" exitCode=0
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.366110 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2","Type":"ContainerDied","Data":"f54ae4dbb8a52d2adfd595700cd49cb809ad589947325d3b2d8bb53908dfb377"}
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.587874 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.687426 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-config-data\") pod \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") "
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.687589 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gs5pn\" (UniqueName: \"kubernetes.io/projected/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-kube-api-access-gs5pn\") pod \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") "
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.687625 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-logs\") pod \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") "
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.687687 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-combined-ca-bundle\") pod \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\" (UID: \"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2\") "
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.691177 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-logs" (OuterVolumeSpecName: "logs") pod "4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" (UID: "4ee0f991-20eb-43f3-bdb0-36c0e8af77e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.695922 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-kube-api-access-gs5pn" (OuterVolumeSpecName: "kube-api-access-gs5pn") pod "4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" (UID: "4ee0f991-20eb-43f3-bdb0-36c0e8af77e2"). InnerVolumeSpecName "kube-api-access-gs5pn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.729179 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" (UID: "4ee0f991-20eb-43f3-bdb0-36c0e8af77e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.731786 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-config-data" (OuterVolumeSpecName: "config-data") pod "4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" (UID: "4ee0f991-20eb-43f3-bdb0-36c0e8af77e2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.790025 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gs5pn\" (UniqueName: \"kubernetes.io/projected/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-kube-api-access-gs5pn\") on node \"crc\" DevicePath \"\""
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.790311 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-logs\") on node \"crc\" DevicePath \"\""
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.790323 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 18:34:24 crc kubenswrapper[4974]: I1013 18:34:24.790330 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2-config-data\") on node \"crc\" DevicePath \"\""
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.387144 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4ee0f991-20eb-43f3-bdb0-36c0e8af77e2","Type":"ContainerDied","Data":"5af436d632b0ed3210d9cb46af895875d8b6a4331eec534942118bb4130b89dc"}
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.387210 4974 scope.go:117] "RemoveContainer" containerID="f54ae4dbb8a52d2adfd595700cd49cb809ad589947325d3b2d8bb53908dfb377"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.387251 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.426405 4974 scope.go:117] "RemoveContainer" containerID="45d13f3258b9ae888d48e66a8c2cdb9c6913d0bd762bee40b7c703f05413edc6"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.429336 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.447334 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.462786 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Oct 13 18:34:25 crc kubenswrapper[4974]: E1013 18:34:25.463389 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerName="nova-api-api"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.463418 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerName="nova-api-api"
Oct 13 18:34:25 crc kubenswrapper[4974]: E1013 18:34:25.463487 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerName="nova-api-log"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.463499 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerName="nova-api-log"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.463931 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerName="nova-api-log"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.463965 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" containerName="nova-api-api"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.465770 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.469277 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.469710 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.471952 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.472814 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.607035 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-public-tls-certs\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.607115 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-internal-tls-certs\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.607179 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-logs\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.607241 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.607386 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-config-data\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.607491 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmptq\" (UniqueName: \"kubernetes.io/projected/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-kube-api-access-tmptq\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.709244 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-config-data\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.709345 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmptq\" (UniqueName: \"kubernetes.io/projected/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-kube-api-access-tmptq\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.709717 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-public-tls-certs\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.709756 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-internal-tls-certs\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.709789 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-logs\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.710190 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-logs\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.710400 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.715956 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-internal-tls-certs\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.716430 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.716673 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-public-tls-certs\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.717345 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-config-data\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.732259 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmptq\" (UniqueName: \"kubernetes.io/projected/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-kube-api-access-tmptq\") pod \"nova-api-0\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.830204 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Oct 13 18:34:25 crc kubenswrapper[4974]: I1013 18:34:25.834648 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ee0f991-20eb-43f3-bdb0-36c0e8af77e2" path="/var/lib/kubelet/pods/4ee0f991-20eb-43f3-bdb0-36c0e8af77e2/volumes"
Oct 13 18:34:26 crc kubenswrapper[4974]: W1013 18:34:26.325239 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52583ef5_6ec7_4ae1_a2bf_5d69edbce037.slice/crio-9b9e58cfc2dc78ff15304bb9790b881841c9f3ec67bf947ee0775b0bc371467f WatchSource:0}: Error finding container 9b9e58cfc2dc78ff15304bb9790b881841c9f3ec67bf947ee0775b0bc371467f: Status 404 returned error can't find the container with id 9b9e58cfc2dc78ff15304bb9790b881841c9f3ec67bf947ee0775b0bc371467f
Oct 13 18:34:26 crc kubenswrapper[4974]: I1013 18:34:26.329814 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Oct 13 18:34:26 crc kubenswrapper[4974]: I1013 18:34:26.401556 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52583ef5-6ec7-4ae1-a2bf-5d69edbce037","Type":"ContainerStarted","Data":"9b9e58cfc2dc78ff15304bb9790b881841c9f3ec67bf947ee0775b0bc371467f"}
Oct 13 18:34:27 crc kubenswrapper[4974]: I1013 18:34:27.418876 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52583ef5-6ec7-4ae1-a2bf-5d69edbce037","Type":"ContainerStarted","Data":"2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c"}
Oct 13 18:34:27 crc kubenswrapper[4974]: I1013 18:34:27.419324 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52583ef5-6ec7-4ae1-a2bf-5d69edbce037","Type":"ContainerStarted","Data":"76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0"}
Oct 13 18:34:27 crc kubenswrapper[4974]: I1013 18:34:27.454375 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.454350305 podStartE2EDuration="2.454350305s" podCreationTimestamp="2025-10-13 18:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:34:27.438200075 +0000 UTC m=+1202.342566195" watchObservedRunningTime="2025-10-13 18:34:27.454350305 +0000 UTC m=+1202.358716395"
Oct 13 18:34:28 crc kubenswrapper[4974]: I1013 18:34:28.654054 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Oct 13 18:34:28 crc kubenswrapper[4974]: I1013 18:34:28.672852 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.464941 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.667220 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-gvkwd"]
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.669380 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.674241 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.674835 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.680063 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gvkwd"]
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.803788 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.803933 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5hzj\" (UniqueName: \"kubernetes.io/projected/02183b03-008b-42e4-89f7-7b1186eada64-kube-api-access-t5hzj\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.803992 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-scripts\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.804639 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-config-data\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.880878 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.907556 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-config-data\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.907680 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.907763 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5hzj\" (UniqueName: \"kubernetes.io/projected/02183b03-008b-42e4-89f7-7b1186eada64-kube-api-access-t5hzj\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.907795 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-scripts\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.917076 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.918221 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-scripts\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.923838 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-config-data\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.953374 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5hzj\" (UniqueName: \"kubernetes.io/projected/02183b03-008b-42e4-89f7-7b1186eada64-kube-api-access-t5hzj\") pod \"nova-cell1-cell-mapping-gvkwd\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " pod="openstack/nova-cell1-cell-mapping-gvkwd"
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.958428 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75d8d75995-bsnd6"]
Oct 13 18:34:29 crc kubenswrapper[4974]: I1013 18:34:29.961193 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" podUID="6d898346-454c-4164-ba27-25363b7a75cb" containerName="dnsmasq-dns"
containerID="cri-o://18c068ad3700e763b5e36120aff96bf845ccd0dd0a63e9e1d87076ba9a268b70" gracePeriod=10 Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.004242 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gvkwd" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.455326 4974 generic.go:334] "Generic (PLEG): container finished" podID="6d898346-454c-4164-ba27-25363b7a75cb" containerID="18c068ad3700e763b5e36120aff96bf845ccd0dd0a63e9e1d87076ba9a268b70" exitCode=0 Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.456987 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" event={"ID":"6d898346-454c-4164-ba27-25363b7a75cb","Type":"ContainerDied","Data":"18c068ad3700e763b5e36120aff96bf845ccd0dd0a63e9e1d87076ba9a268b70"} Oct 13 18:34:30 crc kubenswrapper[4974]: W1013 18:34:30.509838 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02183b03_008b_42e4_89f7_7b1186eada64.slice/crio-16334175d006fa8f22883c02ccffc0915a198b83aaede75705e395c975ef405b WatchSource:0}: Error finding container 16334175d006fa8f22883c02ccffc0915a198b83aaede75705e395c975ef405b: Status 404 returned error can't find the container with id 16334175d006fa8f22883c02ccffc0915a198b83aaede75705e395c975ef405b Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.516103 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-gvkwd"] Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.543486 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.642422 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-sb\") pod \"6d898346-454c-4164-ba27-25363b7a75cb\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.642510 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-swift-storage-0\") pod \"6d898346-454c-4164-ba27-25363b7a75cb\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.642547 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-nb\") pod \"6d898346-454c-4164-ba27-25363b7a75cb\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.642582 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-config\") pod \"6d898346-454c-4164-ba27-25363b7a75cb\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.642680 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtkkv\" (UniqueName: \"kubernetes.io/projected/6d898346-454c-4164-ba27-25363b7a75cb-kube-api-access-wtkkv\") pod \"6d898346-454c-4164-ba27-25363b7a75cb\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.642724 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-svc\") pod \"6d898346-454c-4164-ba27-25363b7a75cb\" (UID: \"6d898346-454c-4164-ba27-25363b7a75cb\") " Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.649986 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d898346-454c-4164-ba27-25363b7a75cb-kube-api-access-wtkkv" (OuterVolumeSpecName: "kube-api-access-wtkkv") pod "6d898346-454c-4164-ba27-25363b7a75cb" (UID: "6d898346-454c-4164-ba27-25363b7a75cb"). InnerVolumeSpecName "kube-api-access-wtkkv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.718373 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-config" (OuterVolumeSpecName: "config") pod "6d898346-454c-4164-ba27-25363b7a75cb" (UID: "6d898346-454c-4164-ba27-25363b7a75cb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.719971 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6d898346-454c-4164-ba27-25363b7a75cb" (UID: "6d898346-454c-4164-ba27-25363b7a75cb"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.720415 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6d898346-454c-4164-ba27-25363b7a75cb" (UID: "6d898346-454c-4164-ba27-25363b7a75cb"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.727485 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6d898346-454c-4164-ba27-25363b7a75cb" (UID: "6d898346-454c-4164-ba27-25363b7a75cb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.732258 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6d898346-454c-4164-ba27-25363b7a75cb" (UID: "6d898346-454c-4164-ba27-25363b7a75cb"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.745837 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtkkv\" (UniqueName: \"kubernetes.io/projected/6d898346-454c-4164-ba27-25363b7a75cb-kube-api-access-wtkkv\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.745869 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.745880 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.745889 4974 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-dns-swift-storage-0\") on node \"crc\" DevicePath 
\"\"" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.745897 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:30 crc kubenswrapper[4974]: I1013 18:34:30.745906 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6d898346-454c-4164-ba27-25363b7a75cb-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.469476 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" event={"ID":"6d898346-454c-4164-ba27-25363b7a75cb","Type":"ContainerDied","Data":"e1a7c1adad193fface5d48b2dc51eac0279c0f4f0186c9f02487dc8a0b419696"} Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.469813 4974 scope.go:117] "RemoveContainer" containerID="18c068ad3700e763b5e36120aff96bf845ccd0dd0a63e9e1d87076ba9a268b70" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.469930 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-75d8d75995-bsnd6" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.472026 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gvkwd" event={"ID":"02183b03-008b-42e4-89f7-7b1186eada64","Type":"ContainerStarted","Data":"bd40b0f1661d69f3d7d8721979684d97ed639a7713fb21ba07d3f425d47ffd53"} Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.472069 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gvkwd" event={"ID":"02183b03-008b-42e4-89f7-7b1186eada64","Type":"ContainerStarted","Data":"16334175d006fa8f22883c02ccffc0915a198b83aaede75705e395c975ef405b"} Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.477578 4974 generic.go:334] "Generic (PLEG): container finished" podID="616aad84-5d9d-41d6-9342-67199ffade4a" containerID="aed60a5c13f637f89fa1e08e9857003f10646a88f4d3cd9ff45783c355390c62" exitCode=0 Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.477617 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"616aad84-5d9d-41d6-9342-67199ffade4a","Type":"ContainerDied","Data":"aed60a5c13f637f89fa1e08e9857003f10646a88f4d3cd9ff45783c355390c62"} Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.477640 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"616aad84-5d9d-41d6-9342-67199ffade4a","Type":"ContainerDied","Data":"2a5addbc223f2ddebd5eb30034700bf1f979f3cc54f7d1e1c9921fa42dd043ba"} Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.477661 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5addbc223f2ddebd5eb30034700bf1f979f3cc54f7d1e1c9921fa42dd043ba" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.492414 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-gvkwd" podStartSLOduration=2.492397294 
podStartE2EDuration="2.492397294s" podCreationTimestamp="2025-10-13 18:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:34:31.490871281 +0000 UTC m=+1206.395237371" watchObservedRunningTime="2025-10-13 18:34:31.492397294 +0000 UTC m=+1206.396763374" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.561408 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.582290 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75d8d75995-bsnd6"] Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.582505 4974 scope.go:117] "RemoveContainer" containerID="aafa3f24015eeebd60bed988b4136b8e4f9d40ad22d2267a398d88f7ced859ca" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.591782 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75d8d75995-bsnd6"] Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.673560 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-ceilometer-tls-certs\") pod \"616aad84-5d9d-41d6-9342-67199ffade4a\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.673625 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2sqv\" (UniqueName: \"kubernetes.io/projected/616aad84-5d9d-41d6-9342-67199ffade4a-kube-api-access-f2sqv\") pod \"616aad84-5d9d-41d6-9342-67199ffade4a\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.673644 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-combined-ca-bundle\") pod \"616aad84-5d9d-41d6-9342-67199ffade4a\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.673728 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-run-httpd\") pod \"616aad84-5d9d-41d6-9342-67199ffade4a\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.673766 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-scripts\") pod \"616aad84-5d9d-41d6-9342-67199ffade4a\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.673803 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-config-data\") pod \"616aad84-5d9d-41d6-9342-67199ffade4a\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.673826 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-sg-core-conf-yaml\") pod \"616aad84-5d9d-41d6-9342-67199ffade4a\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.673961 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-log-httpd\") pod \"616aad84-5d9d-41d6-9342-67199ffade4a\" (UID: \"616aad84-5d9d-41d6-9342-67199ffade4a\") " Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.674079 4974 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "616aad84-5d9d-41d6-9342-67199ffade4a" (UID: "616aad84-5d9d-41d6-9342-67199ffade4a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.674409 4974 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-run-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.674898 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "616aad84-5d9d-41d6-9342-67199ffade4a" (UID: "616aad84-5d9d-41d6-9342-67199ffade4a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.680944 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/616aad84-5d9d-41d6-9342-67199ffade4a-kube-api-access-f2sqv" (OuterVolumeSpecName: "kube-api-access-f2sqv") pod "616aad84-5d9d-41d6-9342-67199ffade4a" (UID: "616aad84-5d9d-41d6-9342-67199ffade4a"). InnerVolumeSpecName "kube-api-access-f2sqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.683386 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-scripts" (OuterVolumeSpecName: "scripts") pod "616aad84-5d9d-41d6-9342-67199ffade4a" (UID: "616aad84-5d9d-41d6-9342-67199ffade4a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.711363 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "616aad84-5d9d-41d6-9342-67199ffade4a" (UID: "616aad84-5d9d-41d6-9342-67199ffade4a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.733851 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "616aad84-5d9d-41d6-9342-67199ffade4a" (UID: "616aad84-5d9d-41d6-9342-67199ffade4a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.758565 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "616aad84-5d9d-41d6-9342-67199ffade4a" (UID: "616aad84-5d9d-41d6-9342-67199ffade4a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.776486 4974 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/616aad84-5d9d-41d6-9342-67199ffade4a-log-httpd\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.776558 4974 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.776604 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2sqv\" (UniqueName: \"kubernetes.io/projected/616aad84-5d9d-41d6-9342-67199ffade4a-kube-api-access-f2sqv\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.776614 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.776622 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.776629 4974 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.795925 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-config-data" (OuterVolumeSpecName: "config-data") pod "616aad84-5d9d-41d6-9342-67199ffade4a" (UID: 
"616aad84-5d9d-41d6-9342-67199ffade4a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.829756 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d898346-454c-4164-ba27-25363b7a75cb" path="/var/lib/kubelet/pods/6d898346-454c-4164-ba27-25363b7a75cb/volumes" Oct 13 18:34:31 crc kubenswrapper[4974]: I1013 18:34:31.887378 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/616aad84-5d9d-41d6-9342-67199ffade4a-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.495222 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.532614 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.543937 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.608210 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:34:32 crc kubenswrapper[4974]: E1013 18:34:32.610090 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="ceilometer-notification-agent" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610112 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="ceilometer-notification-agent" Oct 13 18:34:32 crc kubenswrapper[4974]: E1013 18:34:32.610146 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d898346-454c-4164-ba27-25363b7a75cb" containerName="init" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610154 4974 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6d898346-454c-4164-ba27-25363b7a75cb" containerName="init" Oct 13 18:34:32 crc kubenswrapper[4974]: E1013 18:34:32.610189 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="sg-core" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610203 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="sg-core" Oct 13 18:34:32 crc kubenswrapper[4974]: E1013 18:34:32.610248 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="ceilometer-central-agent" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610257 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="ceilometer-central-agent" Oct 13 18:34:32 crc kubenswrapper[4974]: E1013 18:34:32.610278 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="proxy-httpd" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610286 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="proxy-httpd" Oct 13 18:34:32 crc kubenswrapper[4974]: E1013 18:34:32.610310 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d898346-454c-4164-ba27-25363b7a75cb" containerName="dnsmasq-dns" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610318 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d898346-454c-4164-ba27-25363b7a75cb" containerName="dnsmasq-dns" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610783 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d898346-454c-4164-ba27-25363b7a75cb" containerName="dnsmasq-dns" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610818 4974 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="sg-core" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610856 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="ceilometer-central-agent" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610878 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="ceilometer-notification-agent" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.610911 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" containerName="proxy-httpd" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.615521 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.618498 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.620511 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.621432 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.627925 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.706622 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.706918 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-config-data\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.707011 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-scripts\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.707127 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9rhb\" (UniqueName: \"kubernetes.io/projected/cc0eb77b-ce25-42b6-a03f-600b090be522-kube-api-access-l9rhb\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.707193 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.707264 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc0eb77b-ce25-42b6-a03f-600b090be522-log-httpd\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.707350 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.707424 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc0eb77b-ce25-42b6-a03f-600b090be522-run-httpd\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.808827 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.809467 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-config-data\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.809565 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-scripts\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.809733 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9rhb\" (UniqueName: \"kubernetes.io/projected/cc0eb77b-ce25-42b6-a03f-600b090be522-kube-api-access-l9rhb\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc 
kubenswrapper[4974]: I1013 18:34:32.809812 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.809894 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc0eb77b-ce25-42b6-a03f-600b090be522-log-httpd\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.810035 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.810132 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc0eb77b-ce25-42b6-a03f-600b090be522-run-httpd\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.810554 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc0eb77b-ce25-42b6-a03f-600b090be522-run-httpd\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.810922 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/cc0eb77b-ce25-42b6-a03f-600b090be522-log-httpd\") pod 
\"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.814770 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.814902 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-scripts\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.815097 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-config-data\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.815424 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.817994 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc0eb77b-ce25-42b6-a03f-600b090be522-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.827380 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9rhb\" 
(UniqueName: \"kubernetes.io/projected/cc0eb77b-ce25-42b6-a03f-600b090be522-kube-api-access-l9rhb\") pod \"ceilometer-0\" (UID: \"cc0eb77b-ce25-42b6-a03f-600b090be522\") " pod="openstack/ceilometer-0" Oct 13 18:34:32 crc kubenswrapper[4974]: I1013 18:34:32.933294 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Oct 13 18:34:33 crc kubenswrapper[4974]: I1013 18:34:33.450255 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Oct 13 18:34:33 crc kubenswrapper[4974]: I1013 18:34:33.508070 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc0eb77b-ce25-42b6-a03f-600b090be522","Type":"ContainerStarted","Data":"d2789b6e55e4ca9c64c8922a95911e841067528e32d5f7087262843773e07646"} Oct 13 18:34:33 crc kubenswrapper[4974]: I1013 18:34:33.821413 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="616aad84-5d9d-41d6-9342-67199ffade4a" path="/var/lib/kubelet/pods/616aad84-5d9d-41d6-9342-67199ffade4a/volumes" Oct 13 18:34:34 crc kubenswrapper[4974]: I1013 18:34:34.518437 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc0eb77b-ce25-42b6-a03f-600b090be522","Type":"ContainerStarted","Data":"6c652d00955a9519b899505fd54166377820d8d0fc4eac294e0e561465f2dc26"} Oct 13 18:34:34 crc kubenswrapper[4974]: I1013 18:34:34.518822 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc0eb77b-ce25-42b6-a03f-600b090be522","Type":"ContainerStarted","Data":"3880047ebdd5d4ac3c48f1dc17ec51b579b3755899ba6a69fe217f798317513d"} Oct 13 18:34:35 crc kubenswrapper[4974]: I1013 18:34:35.532376 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc0eb77b-ce25-42b6-a03f-600b090be522","Type":"ContainerStarted","Data":"4bf70237e29bb6be49f2578e99ff5fe57c199a6ea03b6b364396c452719e7d83"} Oct 13 18:34:35 crc 
kubenswrapper[4974]: I1013 18:34:35.837455 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 18:34:35 crc kubenswrapper[4974]: I1013 18:34:35.837752 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 18:34:36 crc kubenswrapper[4974]: I1013 18:34:36.552405 4974 generic.go:334] "Generic (PLEG): container finished" podID="02183b03-008b-42e4-89f7-7b1186eada64" containerID="bd40b0f1661d69f3d7d8721979684d97ed639a7713fb21ba07d3f425d47ffd53" exitCode=0 Oct 13 18:34:36 crc kubenswrapper[4974]: I1013 18:34:36.552452 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gvkwd" event={"ID":"02183b03-008b-42e4-89f7-7b1186eada64","Type":"ContainerDied","Data":"bd40b0f1661d69f3d7d8721979684d97ed639a7713fb21ba07d3f425d47ffd53"} Oct 13 18:34:36 crc kubenswrapper[4974]: I1013 18:34:36.868177 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.227:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 18:34:36 crc kubenswrapper[4974]: I1013 18:34:36.868209 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.227:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 18:34:37 crc kubenswrapper[4974]: I1013 18:34:37.576002 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"cc0eb77b-ce25-42b6-a03f-600b090be522","Type":"ContainerStarted","Data":"4aab277064b79ce60eeab9230620fab04dd65506d13f585246c751619c21ff8f"} Oct 13 18:34:37 crc kubenswrapper[4974]: I1013 18:34:37.576072 4974 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Oct 13 18:34:37 crc kubenswrapper[4974]: I1013 18:34:37.624213 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.464068565 podStartE2EDuration="5.624189107s" podCreationTimestamp="2025-10-13 18:34:32 +0000 UTC" firstStartedPulling="2025-10-13 18:34:33.459271108 +0000 UTC m=+1208.363637188" lastFinishedPulling="2025-10-13 18:34:36.61939164 +0000 UTC m=+1211.523757730" observedRunningTime="2025-10-13 18:34:37.608577412 +0000 UTC m=+1212.512943502" watchObservedRunningTime="2025-10-13 18:34:37.624189107 +0000 UTC m=+1212.528555197" Oct 13 18:34:37 crc kubenswrapper[4974]: I1013 18:34:37.743345 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:34:37 crc kubenswrapper[4974]: I1013 18:34:37.743410 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.043363 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gvkwd" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.120266 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-scripts\") pod \"02183b03-008b-42e4-89f7-7b1186eada64\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.120435 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t5hzj\" (UniqueName: \"kubernetes.io/projected/02183b03-008b-42e4-89f7-7b1186eada64-kube-api-access-t5hzj\") pod \"02183b03-008b-42e4-89f7-7b1186eada64\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.122803 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-config-data\") pod \"02183b03-008b-42e4-89f7-7b1186eada64\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.123007 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-combined-ca-bundle\") pod \"02183b03-008b-42e4-89f7-7b1186eada64\" (UID: \"02183b03-008b-42e4-89f7-7b1186eada64\") " Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.132877 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-scripts" (OuterVolumeSpecName: "scripts") pod "02183b03-008b-42e4-89f7-7b1186eada64" (UID: "02183b03-008b-42e4-89f7-7b1186eada64"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.134353 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02183b03-008b-42e4-89f7-7b1186eada64-kube-api-access-t5hzj" (OuterVolumeSpecName: "kube-api-access-t5hzj") pod "02183b03-008b-42e4-89f7-7b1186eada64" (UID: "02183b03-008b-42e4-89f7-7b1186eada64"). InnerVolumeSpecName "kube-api-access-t5hzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.162941 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "02183b03-008b-42e4-89f7-7b1186eada64" (UID: "02183b03-008b-42e4-89f7-7b1186eada64"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.174979 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-config-data" (OuterVolumeSpecName: "config-data") pod "02183b03-008b-42e4-89f7-7b1186eada64" (UID: "02183b03-008b-42e4-89f7-7b1186eada64"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.225486 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.225517 4974 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-scripts\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.225528 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t5hzj\" (UniqueName: \"kubernetes.io/projected/02183b03-008b-42e4-89f7-7b1186eada64-kube-api-access-t5hzj\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.225539 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/02183b03-008b-42e4-89f7-7b1186eada64-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.586647 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-gvkwd" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.590952 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-gvkwd" event={"ID":"02183b03-008b-42e4-89f7-7b1186eada64","Type":"ContainerDied","Data":"16334175d006fa8f22883c02ccffc0915a198b83aaede75705e395c975ef405b"} Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.591032 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16334175d006fa8f22883c02ccffc0915a198b83aaede75705e395c975ef405b" Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.765346 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.766193 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerName="nova-api-log" containerID="cri-o://76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0" gracePeriod=30 Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.766253 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerName="nova-api-api" containerID="cri-o://2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c" gracePeriod=30 Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.779194 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.779399 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="bbb2ba38-77ca-4954-998c-ca99c34e0b3c" containerName="nova-scheduler-scheduler" containerID="cri-o://3e2285772d7905b9bda0126dc94d24d559183db00d7b16cfb8c16c8db10d82a4" gracePeriod=30 Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 
18:34:38.844103 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.844319 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-log" containerID="cri-o://b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3" gracePeriod=30 Oct 13 18:34:38 crc kubenswrapper[4974]: I1013 18:34:38.844363 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-metadata" containerID="cri-o://316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee" gracePeriod=30 Oct 13 18:34:39 crc kubenswrapper[4974]: I1013 18:34:39.597937 4974 generic.go:334] "Generic (PLEG): container finished" podID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerID="76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0" exitCode=143 Oct 13 18:34:39 crc kubenswrapper[4974]: I1013 18:34:39.598005 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52583ef5-6ec7-4ae1-a2bf-5d69edbce037","Type":"ContainerDied","Data":"76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0"} Oct 13 18:34:39 crc kubenswrapper[4974]: I1013 18:34:39.599970 4974 generic.go:334] "Generic (PLEG): container finished" podID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerID="b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3" exitCode=143 Oct 13 18:34:39 crc kubenswrapper[4974]: I1013 18:34:39.600031 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f","Type":"ContainerDied","Data":"b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3"} Oct 13 18:34:39 crc kubenswrapper[4974]: I1013 18:34:39.709520 4974 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": read tcp 10.217.0.2:58938->10.217.0.221:8775: read: connection reset by peer" Oct 13 18:34:39 crc kubenswrapper[4974]: I1013 18:34:39.709635 4974 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.221:8775/\": read tcp 10.217.0.2:58936->10.217.0.221:8775: read: connection reset by peer" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.170162 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.270632 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-public-tls-certs\") pod \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.270715 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmptq\" (UniqueName: \"kubernetes.io/projected/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-kube-api-access-tmptq\") pod \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.270743 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-config-data\") pod \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.270823 4974 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-internal-tls-certs\") pod \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.270915 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-combined-ca-bundle\") pod \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.270968 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-logs\") pod \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\" (UID: \"52583ef5-6ec7-4ae1-a2bf-5d69edbce037\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.274047 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-logs" (OuterVolumeSpecName: "logs") pod "52583ef5-6ec7-4ae1-a2bf-5d69edbce037" (UID: "52583ef5-6ec7-4ae1-a2bf-5d69edbce037"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.279892 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-kube-api-access-tmptq" (OuterVolumeSpecName: "kube-api-access-tmptq") pod "52583ef5-6ec7-4ae1-a2bf-5d69edbce037" (UID: "52583ef5-6ec7-4ae1-a2bf-5d69edbce037"). InnerVolumeSpecName "kube-api-access-tmptq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.302313 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52583ef5-6ec7-4ae1-a2bf-5d69edbce037" (UID: "52583ef5-6ec7-4ae1-a2bf-5d69edbce037"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.305043 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-config-data" (OuterVolumeSpecName: "config-data") pod "52583ef5-6ec7-4ae1-a2bf-5d69edbce037" (UID: "52583ef5-6ec7-4ae1-a2bf-5d69edbce037"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.331178 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.337331 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "52583ef5-6ec7-4ae1-a2bf-5d69edbce037" (UID: "52583ef5-6ec7-4ae1-a2bf-5d69edbce037"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.373414 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msxsn\" (UniqueName: \"kubernetes.io/projected/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-kube-api-access-msxsn\") pod \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.373519 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-logs\") pod \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.373642 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-config-data\") pod \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.374388 4974 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.374413 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.374429 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-logs\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.374443 4974 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-tmptq\" (UniqueName: \"kubernetes.io/projected/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-kube-api-access-tmptq\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.374461 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.378011 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-logs" (OuterVolumeSpecName: "logs") pod "d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" (UID: "d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.378321 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-kube-api-access-msxsn" (OuterVolumeSpecName: "kube-api-access-msxsn") pod "d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" (UID: "d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f"). InnerVolumeSpecName "kube-api-access-msxsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.382871 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52583ef5-6ec7-4ae1-a2bf-5d69edbce037" (UID: "52583ef5-6ec7-4ae1-a2bf-5d69edbce037"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.412626 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-config-data" (OuterVolumeSpecName: "config-data") pod "d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" (UID: "d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.475867 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-combined-ca-bundle\") pod \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.475950 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-nova-metadata-tls-certs\") pod \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\" (UID: \"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f\") " Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.476422 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msxsn\" (UniqueName: \"kubernetes.io/projected/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-kube-api-access-msxsn\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.476440 4974 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52583ef5-6ec7-4ae1-a2bf-5d69edbce037-public-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.476450 4974 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-logs\") on node \"crc\" 
DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.476458 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.501027 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3e2285772d7905b9bda0126dc94d24d559183db00d7b16cfb8c16c8db10d82a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.503712 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3e2285772d7905b9bda0126dc94d24d559183db00d7b16cfb8c16c8db10d82a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.505381 4974 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3e2285772d7905b9bda0126dc94d24d559183db00d7b16cfb8c16c8db10d82a4" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.505438 4974 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="bbb2ba38-77ca-4954-998c-ca99c34e0b3c" containerName="nova-scheduler-scheduler" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.513943 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" (UID: "d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.529837 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" (UID: "d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.581413 4974 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.581455 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.609575 4974 generic.go:334] "Generic (PLEG): container finished" podID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerID="316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee" exitCode=0 Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.609617 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f","Type":"ContainerDied","Data":"316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee"} Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.609652 4974 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.609692 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f","Type":"ContainerDied","Data":"9d1e31ecacae10e82a1049317a4503ba8949a1bed055bb6f9ebf349c1d2a6cc4"} Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.609712 4974 scope.go:117] "RemoveContainer" containerID="316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.611609 4974 generic.go:334] "Generic (PLEG): container finished" podID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerID="2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c" exitCode=0 Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.611641 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52583ef5-6ec7-4ae1-a2bf-5d69edbce037","Type":"ContainerDied","Data":"2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c"} Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.611676 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"52583ef5-6ec7-4ae1-a2bf-5d69edbce037","Type":"ContainerDied","Data":"9b9e58cfc2dc78ff15304bb9790b881841c9f3ec67bf947ee0775b0bc371467f"} Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.611748 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.635792 4974 scope.go:117] "RemoveContainer" containerID="b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.651370 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.667231 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.679423 4974 scope.go:117] "RemoveContainer" containerID="316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee" Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.690764 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee\": container with ID starting with 316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee not found: ID does not exist" containerID="316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.690831 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee"} err="failed to get container status \"316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee\": rpc error: code = NotFound desc = could not find container \"316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee\": container with ID starting with 316056584f241f6e7b40200669c3debc47abbc8b94b0ae786da53a48e5670eee not found: ID does not exist" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.690862 4974 scope.go:117] "RemoveContainer" containerID="b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3" Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 
18:34:40.699772 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3\": container with ID starting with b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3 not found: ID does not exist" containerID="b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.699817 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3"} err="failed to get container status \"b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3\": rpc error: code = NotFound desc = could not find container \"b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3\": container with ID starting with b659ac162e6e9c8480e01bc9b4c799d05603e63645e3432af9cac097c45e1be3 not found: ID does not exist" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.699844 4974 scope.go:117] "RemoveContainer" containerID="2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.710374 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.710862 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerName="nova-api-log" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.710880 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerName="nova-api-log" Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.710914 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerName="nova-api-api" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.710921 4974 
state_mem.go:107] "Deleted CPUSet assignment" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerName="nova-api-api" Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.710929 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02183b03-008b-42e4-89f7-7b1186eada64" containerName="nova-manage" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.710935 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="02183b03-008b-42e4-89f7-7b1186eada64" containerName="nova-manage" Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.710954 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-log" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.710960 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-log" Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.710976 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-metadata" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.710981 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-metadata" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.711150 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-log" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.711165 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="02183b03-008b-42e4-89f7-7b1186eada64" containerName="nova-manage" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.711173 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerName="nova-api-log" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.711186 4974 
memory_manager.go:354] "RemoveStaleState removing state" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" containerName="nova-metadata-metadata" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.711201 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" containerName="nova-api-api" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.712238 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.714881 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.715309 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.734339 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.742605 4974 scope.go:117] "RemoveContainer" containerID="76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.751382 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.763248 4974 scope.go:117] "RemoveContainer" containerID="2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c" Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.764463 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c\": container with ID starting with 2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c not found: ID does not exist" containerID="2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c" Oct 13 18:34:40 crc 
kubenswrapper[4974]: I1013 18:34:40.764502 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c"} err="failed to get container status \"2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c\": rpc error: code = NotFound desc = could not find container \"2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c\": container with ID starting with 2759ab580b260933c82777cfc44bc53f337484b8d9eae5b89546f110cd6be33c not found: ID does not exist" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.764530 4974 scope.go:117] "RemoveContainer" containerID="76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0" Oct 13 18:34:40 crc kubenswrapper[4974]: E1013 18:34:40.764851 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0\": container with ID starting with 76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0 not found: ID does not exist" containerID="76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.764883 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0"} err="failed to get container status \"76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0\": rpc error: code = NotFound desc = could not find container \"76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0\": container with ID starting with 76b69cfc0cd7875a8dfc0c27c49e0110e92350bc7cddd10be7a01965252687a0 not found: ID does not exist" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.769345 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:34:40 crc 
kubenswrapper[4974]: I1013 18:34:40.793864 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.795672 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.802175 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.802362 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.802471 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.814223 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.886785 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46af3042-50b4-462e-9449-4d521fd32afa-logs\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.886835 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46af3042-50b4-462e-9449-4d521fd32afa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.886892 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfq7h\" (UniqueName: \"kubernetes.io/projected/46af3042-50b4-462e-9449-4d521fd32afa-kube-api-access-sfq7h\") pod \"nova-metadata-0\" 
(UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.886987 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/46af3042-50b4-462e-9449-4d521fd32afa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.887050 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46af3042-50b4-462e-9449-4d521fd32afa-config-data\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988267 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-logs\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988330 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46af3042-50b4-462e-9449-4d521fd32afa-config-data\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988362 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-config-data\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988433 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46af3042-50b4-462e-9449-4d521fd32afa-logs\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988458 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988552 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46af3042-50b4-462e-9449-4d521fd32afa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988607 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-public-tls-certs\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988690 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-internal-tls-certs\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988849 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfq7h\" (UniqueName: 
\"kubernetes.io/projected/46af3042-50b4-462e-9449-4d521fd32afa-kube-api-access-sfq7h\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988915 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjprj\" (UniqueName: \"kubernetes.io/projected/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-kube-api-access-fjprj\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.988968 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/46af3042-50b4-462e-9449-4d521fd32afa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.989286 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46af3042-50b4-462e-9449-4d521fd32afa-logs\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.994620 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/46af3042-50b4-462e-9449-4d521fd32afa-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.994848 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46af3042-50b4-462e-9449-4d521fd32afa-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " 
pod="openstack/nova-metadata-0" Oct 13 18:34:40 crc kubenswrapper[4974]: I1013 18:34:40.995173 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46af3042-50b4-462e-9449-4d521fd32afa-config-data\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.011107 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfq7h\" (UniqueName: \"kubernetes.io/projected/46af3042-50b4-462e-9449-4d521fd32afa-kube-api-access-sfq7h\") pod \"nova-metadata-0\" (UID: \"46af3042-50b4-462e-9449-4d521fd32afa\") " pod="openstack/nova-metadata-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.037710 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.090443 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.090506 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-public-tls-certs\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.090539 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-internal-tls-certs\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 
18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.090586 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjprj\" (UniqueName: \"kubernetes.io/projected/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-kube-api-access-fjprj\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.090702 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-logs\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.090768 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-config-data\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.091308 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-logs\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.095440 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.108643 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-public-tls-certs\") pod \"nova-api-0\" (UID: 
\"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.109387 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-internal-tls-certs\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.110327 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-config-data\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.110758 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjprj\" (UniqueName: \"kubernetes.io/projected/420e7e88-d552-4dd6-b5f8-b8ec9d8b9354-kube-api-access-fjprj\") pod \"nova-api-0\" (UID: \"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354\") " pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.130235 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.354809 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.632834 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46af3042-50b4-462e-9449-4d521fd32afa","Type":"ContainerStarted","Data":"32f00d790cf2e9501cb34e87cba757fdb2e790d3de54326fd05de2f383a64716"} Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.842741 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52583ef5-6ec7-4ae1-a2bf-5d69edbce037" path="/var/lib/kubelet/pods/52583ef5-6ec7-4ae1-a2bf-5d69edbce037/volumes" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.844220 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f" path="/var/lib/kubelet/pods/d5d3a2a4-1b3d-4cd2-a7ba-6466528bdb2f/volumes" Oct 13 18:34:41 crc kubenswrapper[4974]: I1013 18:34:41.844978 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Oct 13 18:34:41 crc kubenswrapper[4974]: W1013 18:34:41.847762 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod420e7e88_d552_4dd6_b5f8_b8ec9d8b9354.slice/crio-3c7d343560e673306d9f7e62b26c8a30f5dd67e4ab726143d854bd7ff9fb921b WatchSource:0}: Error finding container 3c7d343560e673306d9f7e62b26c8a30f5dd67e4ab726143d854bd7ff9fb921b: Status 404 returned error can't find the container with id 3c7d343560e673306d9f7e62b26c8a30f5dd67e4ab726143d854bd7ff9fb921b Oct 13 18:34:42 crc kubenswrapper[4974]: I1013 18:34:42.648454 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46af3042-50b4-462e-9449-4d521fd32afa","Type":"ContainerStarted","Data":"2877896b0b93c0c68ff6dfc5c3a0aee8e7686040eb8f6a313c79904da3f4dc2b"} Oct 
13 18:34:42 crc kubenswrapper[4974]: I1013 18:34:42.648853 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"46af3042-50b4-462e-9449-4d521fd32afa","Type":"ContainerStarted","Data":"b4b8a9a6d626473f330e131675e5d52b8a0c0ff82986f5fc3c352278ab2120b4"} Oct 13 18:34:42 crc kubenswrapper[4974]: I1013 18:34:42.650681 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354","Type":"ContainerStarted","Data":"e104dba199554ca6feabd50d58971ef89e30a0c61b14cd6be9f853b456e800c9"} Oct 13 18:34:42 crc kubenswrapper[4974]: I1013 18:34:42.650737 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354","Type":"ContainerStarted","Data":"2a634a2ded0284f5e2a68f3e048060c6d3ebd5938512a6c6faa9a3b74aaf388b"} Oct 13 18:34:42 crc kubenswrapper[4974]: I1013 18:34:42.650758 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"420e7e88-d552-4dd6-b5f8-b8ec9d8b9354","Type":"ContainerStarted","Data":"3c7d343560e673306d9f7e62b26c8a30f5dd67e4ab726143d854bd7ff9fb921b"} Oct 13 18:34:42 crc kubenswrapper[4974]: I1013 18:34:42.669486 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.669469279 podStartE2EDuration="2.669469279s" podCreationTimestamp="2025-10-13 18:34:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:34:42.666393243 +0000 UTC m=+1217.570759333" watchObservedRunningTime="2025-10-13 18:34:42.669469279 +0000 UTC m=+1217.573835379" Oct 13 18:34:42 crc kubenswrapper[4974]: I1013 18:34:42.708483 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.708458624 podStartE2EDuration="2.708458624s" 
podCreationTimestamp="2025-10-13 18:34:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:34:42.694028502 +0000 UTC m=+1217.598394582" watchObservedRunningTime="2025-10-13 18:34:42.708458624 +0000 UTC m=+1217.612824744" Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.681334 4974 generic.go:334] "Generic (PLEG): container finished" podID="bbb2ba38-77ca-4954-998c-ca99c34e0b3c" containerID="3e2285772d7905b9bda0126dc94d24d559183db00d7b16cfb8c16c8db10d82a4" exitCode=0 Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.681865 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbb2ba38-77ca-4954-998c-ca99c34e0b3c","Type":"ContainerDied","Data":"3e2285772d7905b9bda0126dc94d24d559183db00d7b16cfb8c16c8db10d82a4"} Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.681888 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"bbb2ba38-77ca-4954-998c-ca99c34e0b3c","Type":"ContainerDied","Data":"339487af903fd03e84072d9ebf8e1417d4ee4caee501a555b9ff3221897a07c6"} Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.681898 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="339487af903fd03e84072d9ebf8e1417d4ee4caee501a555b9ff3221897a07c6" Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.745968 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.889052 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-combined-ca-bundle\") pod \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.889164 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qd9v\" (UniqueName: \"kubernetes.io/projected/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-kube-api-access-8qd9v\") pod \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.889206 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-config-data\") pod \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\" (UID: \"bbb2ba38-77ca-4954-998c-ca99c34e0b3c\") " Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.894353 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-kube-api-access-8qd9v" (OuterVolumeSpecName: "kube-api-access-8qd9v") pod "bbb2ba38-77ca-4954-998c-ca99c34e0b3c" (UID: "bbb2ba38-77ca-4954-998c-ca99c34e0b3c"). InnerVolumeSpecName "kube-api-access-8qd9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.916719 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-config-data" (OuterVolumeSpecName: "config-data") pod "bbb2ba38-77ca-4954-998c-ca99c34e0b3c" (UID: "bbb2ba38-77ca-4954-998c-ca99c34e0b3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.926579 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bbb2ba38-77ca-4954-998c-ca99c34e0b3c" (UID: "bbb2ba38-77ca-4954-998c-ca99c34e0b3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.991785 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.991814 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qd9v\" (UniqueName: \"kubernetes.io/projected/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-kube-api-access-8qd9v\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:44 crc kubenswrapper[4974]: I1013 18:34:44.991825 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bbb2ba38-77ca-4954-998c-ca99c34e0b3c-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.696512 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.740076 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.763994 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.775109 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:45 crc kubenswrapper[4974]: E1013 18:34:45.775775 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbb2ba38-77ca-4954-998c-ca99c34e0b3c" containerName="nova-scheduler-scheduler" Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.775809 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbb2ba38-77ca-4954-998c-ca99c34e0b3c" containerName="nova-scheduler-scheduler" Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.776314 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbb2ba38-77ca-4954-998c-ca99c34e0b3c" containerName="nova-scheduler-scheduler" Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.777592 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.780390 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.791330 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.833773 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbb2ba38-77ca-4954-998c-ca99c34e0b3c" path="/var/lib/kubelet/pods/bbb2ba38-77ca-4954-998c-ca99c34e0b3c/volumes" Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.910632 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcfqn\" (UniqueName: \"kubernetes.io/projected/a5915d47-3416-4678-8589-31ea94154b54-kube-api-access-hcfqn\") pod \"nova-scheduler-0\" (UID: \"a5915d47-3416-4678-8589-31ea94154b54\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.910760 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5915d47-3416-4678-8589-31ea94154b54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5915d47-3416-4678-8589-31ea94154b54\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:45 crc kubenswrapper[4974]: I1013 18:34:45.910844 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5915d47-3416-4678-8589-31ea94154b54-config-data\") pod \"nova-scheduler-0\" (UID: \"a5915d47-3416-4678-8589-31ea94154b54\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.012649 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a5915d47-3416-4678-8589-31ea94154b54-config-data\") pod \"nova-scheduler-0\" (UID: \"a5915d47-3416-4678-8589-31ea94154b54\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.012924 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcfqn\" (UniqueName: \"kubernetes.io/projected/a5915d47-3416-4678-8589-31ea94154b54-kube-api-access-hcfqn\") pod \"nova-scheduler-0\" (UID: \"a5915d47-3416-4678-8589-31ea94154b54\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.013003 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5915d47-3416-4678-8589-31ea94154b54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5915d47-3416-4678-8589-31ea94154b54\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.021569 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5915d47-3416-4678-8589-31ea94154b54-config-data\") pod \"nova-scheduler-0\" (UID: \"a5915d47-3416-4678-8589-31ea94154b54\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.027322 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5915d47-3416-4678-8589-31ea94154b54-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"a5915d47-3416-4678-8589-31ea94154b54\") " pod="openstack/nova-scheduler-0" Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.034865 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcfqn\" (UniqueName: \"kubernetes.io/projected/a5915d47-3416-4678-8589-31ea94154b54-kube-api-access-hcfqn\") pod \"nova-scheduler-0\" (UID: \"a5915d47-3416-4678-8589-31ea94154b54\") " 
pod="openstack/nova-scheduler-0" Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.038489 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.038840 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.106412 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.598395 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Oct 13 18:34:46 crc kubenswrapper[4974]: I1013 18:34:46.709157 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5915d47-3416-4678-8589-31ea94154b54","Type":"ContainerStarted","Data":"08399de8edc0394adf9ca147a72addae0e4d6a27e2093e1e5b6ed75565853c71"} Oct 13 18:34:47 crc kubenswrapper[4974]: I1013 18:34:47.720474 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"a5915d47-3416-4678-8589-31ea94154b54","Type":"ContainerStarted","Data":"bf6ec2b5820b6a62ae7ddb8538cb1a0089063ffabd34e37d2589cb6514383da8"} Oct 13 18:34:47 crc kubenswrapper[4974]: I1013 18:34:47.741012 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7409911019999997 podStartE2EDuration="2.740991102s" podCreationTimestamp="2025-10-13 18:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:34:47.73444524 +0000 UTC m=+1222.638811360" watchObservedRunningTime="2025-10-13 18:34:47.740991102 +0000 UTC m=+1222.645357192" Oct 13 18:34:51 crc kubenswrapper[4974]: I1013 18:34:51.037994 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Oct 13 18:34:51 crc kubenswrapper[4974]: I1013 18:34:51.038848 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Oct 13 18:34:51 crc kubenswrapper[4974]: I1013 18:34:51.107205 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Oct 13 18:34:51 crc kubenswrapper[4974]: I1013 18:34:51.130935 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 18:34:51 crc kubenswrapper[4974]: I1013 18:34:51.130990 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Oct 13 18:34:52 crc kubenswrapper[4974]: I1013 18:34:52.051879 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="46af3042-50b4-462e-9449-4d521fd32afa" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 18:34:52 crc kubenswrapper[4974]: I1013 18:34:52.051903 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="46af3042-50b4-462e-9449-4d521fd32afa" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.230:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 18:34:52 crc kubenswrapper[4974]: I1013 18:34:52.145812 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="420e7e88-d552-4dd6-b5f8-b8ec9d8b9354" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 18:34:52 crc kubenswrapper[4974]: I1013 18:34:52.145855 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="420e7e88-d552-4dd6-b5f8-b8ec9d8b9354" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Oct 13 18:34:56 crc kubenswrapper[4974]: I1013 18:34:56.107880 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Oct 13 18:34:56 crc kubenswrapper[4974]: I1013 18:34:56.160476 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Oct 13 18:34:56 crc kubenswrapper[4974]: I1013 18:34:56.872171 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Oct 13 18:35:01 crc kubenswrapper[4974]: I1013 18:35:01.045172 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 18:35:01 crc kubenswrapper[4974]: I1013 18:35:01.045838 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Oct 13 18:35:01 crc kubenswrapper[4974]: I1013 18:35:01.055569 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 18:35:01 crc kubenswrapper[4974]: I1013 18:35:01.057796 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Oct 13 18:35:01 crc kubenswrapper[4974]: I1013 18:35:01.162184 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 18:35:01 crc kubenswrapper[4974]: I1013 18:35:01.162670 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 18:35:01 crc kubenswrapper[4974]: I1013 18:35:01.165391 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Oct 13 18:35:01 crc kubenswrapper[4974]: I1013 18:35:01.180164 4974 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 18:35:01 crc kubenswrapper[4974]: I1013 18:35:01.878351 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Oct 13 18:35:01 crc kubenswrapper[4974]: I1013 18:35:01.890045 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Oct 13 18:35:02 crc kubenswrapper[4974]: I1013 18:35:02.950968 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Oct 13 18:35:07 crc kubenswrapper[4974]: I1013 18:35:07.743504 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:35:07 crc kubenswrapper[4974]: I1013 18:35:07.744211 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:35:12 crc kubenswrapper[4974]: I1013 18:35:12.573729 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 18:35:13 crc kubenswrapper[4974]: I1013 18:35:13.661797 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Oct 13 18:35:15 crc kubenswrapper[4974]: I1013 18:35:15.947627 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="97f813f5-f34c-4b82-b066-032f8b795049" containerName="rabbitmq" containerID="cri-o://08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241" gracePeriod=604797 Oct 13 18:35:16 crc kubenswrapper[4974]: 
I1013 18:35:16.581539 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" containerName="rabbitmq" containerID="cri-o://f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9" gracePeriod=604798 Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.563463 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728161 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97f813f5-f34c-4b82-b066-032f8b795049-pod-info\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728219 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-plugins\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728252 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrl8p\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-kube-api-access-lrl8p\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728306 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-config-data\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728353 4974 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-server-conf\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728402 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-confd\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728417 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728437 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-erlang-cookie\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728483 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-plugins-conf\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728540 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97f813f5-f34c-4b82-b066-032f8b795049-erlang-cookie-secret\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: 
\"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.728579 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-tls\") pod \"97f813f5-f34c-4b82-b066-032f8b795049\" (UID: \"97f813f5-f34c-4b82-b066-032f8b795049\") " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.730690 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.736304 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.736407 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.739949 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/97f813f5-f34c-4b82-b066-032f8b795049-pod-info" (OuterVolumeSpecName: "pod-info") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.744319 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-kube-api-access-lrl8p" (OuterVolumeSpecName: "kube-api-access-lrl8p") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "kube-api-access-lrl8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.752139 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.764176 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.767051 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97f813f5-f34c-4b82-b066-032f8b795049-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.768537 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-config-data" (OuterVolumeSpecName: "config-data") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.830216 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.830262 4974 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.830273 4974 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.830283 4974 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-plugins-conf\") on node \"crc\" DevicePath 
\"\"" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.830292 4974 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/97f813f5-f34c-4b82-b066-032f8b795049-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.830300 4974 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.830307 4974 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/97f813f5-f34c-4b82-b066-032f8b795049-pod-info\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.830315 4974 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.830323 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrl8p\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-kube-api-access-lrl8p\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.830851 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-server-conf" (OuterVolumeSpecName: "server-conf") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.862369 4974 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.920848 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "97f813f5-f34c-4b82-b066-032f8b795049" (UID: "97f813f5-f34c-4b82-b066-032f8b795049"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.932128 4974 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/97f813f5-f34c-4b82-b066-032f8b795049-server-conf\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.932147 4974 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/97f813f5-f34c-4b82-b066-032f8b795049-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:17 crc kubenswrapper[4974]: I1013 18:35:17.932157 4974 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.081639 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.096767 4974 generic.go:334] "Generic (PLEG): container finished" podID="97f813f5-f34c-4b82-b066-032f8b795049" containerID="08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241" exitCode=0 Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.096828 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f813f5-f34c-4b82-b066-032f8b795049","Type":"ContainerDied","Data":"08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241"} Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.096856 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"97f813f5-f34c-4b82-b066-032f8b795049","Type":"ContainerDied","Data":"00c9001668603c5a423d64f2b2e013ac2195c48e85bd1d24d991cc31dd729db9"} Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.096876 4974 scope.go:117] "RemoveContainer" containerID="08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.097115 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.098745 4974 generic.go:334] "Generic (PLEG): container finished" podID="a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" containerID="f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9" exitCode=0 Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.098763 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f","Type":"ContainerDied","Data":"f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9"} Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.098778 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f","Type":"ContainerDied","Data":"f1f13a63f8c6e3d47b4841e911f465ed65f9664104e5af973100b4f6699109e9"} Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.098830 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.147293 4974 scope.go:117] "RemoveContainer" containerID="9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.160976 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.171122 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.195859 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 18:35:18 crc kubenswrapper[4974]: E1013 18:35:18.196515 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f813f5-f34c-4b82-b066-032f8b795049" containerName="rabbitmq" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.196576 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f813f5-f34c-4b82-b066-032f8b795049" containerName="rabbitmq" Oct 13 18:35:18 crc kubenswrapper[4974]: E1013 18:35:18.196588 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97f813f5-f34c-4b82-b066-032f8b795049" containerName="setup-container" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.196596 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="97f813f5-f34c-4b82-b066-032f8b795049" containerName="setup-container" Oct 13 18:35:18 crc kubenswrapper[4974]: E1013 18:35:18.196608 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" containerName="rabbitmq" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.196636 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" containerName="rabbitmq" Oct 13 18:35:18 crc kubenswrapper[4974]: E1013 18:35:18.196689 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" containerName="setup-container" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.196699 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" containerName="setup-container" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.197034 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" containerName="rabbitmq" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.197086 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="97f813f5-f34c-4b82-b066-032f8b795049" containerName="rabbitmq" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.198445 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.201084 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-m5zdm" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.206194 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.206344 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.206443 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.206630 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.206765 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.207341 4974 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-plugins-conf" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.213543 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.237394 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-plugins-conf\") pod \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.237447 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr2wt\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-kube-api-access-xr2wt\") pod \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.237474 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-server-conf\") pod \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.237612 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-erlang-cookie-secret\") pod \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.237699 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-config-data\") pod \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc 
kubenswrapper[4974]: I1013 18:35:18.237768 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.237793 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-tls\") pod \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.237830 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-pod-info\") pod \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.237859 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-erlang-cookie\") pod \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.237876 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-confd\") pod \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.237899 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-plugins\") pod 
\"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\" (UID: \"a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f\") " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.238878 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.244051 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.246772 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.247166 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "local-storage07-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.248842 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.249223 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-kube-api-access-xr2wt" (OuterVolumeSpecName: "kube-api-access-xr2wt") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "kube-api-access-xr2wt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.249485 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.255259 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-pod-info" (OuterVolumeSpecName: "pod-info") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.279211 4974 scope.go:117] "RemoveContainer" containerID="08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241" Oct 13 18:35:18 crc kubenswrapper[4974]: E1013 18:35:18.281211 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241\": container with ID starting with 08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241 not found: ID does not exist" containerID="08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.281241 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241"} err="failed to get container status \"08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241\": rpc error: code = NotFound desc = could not find container \"08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241\": container with ID starting with 08820c7e69a46aa47f1c9c759eeebeba39d4076b99080b3b0196395bb44e5241 not found: ID does not exist" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.281260 4974 scope.go:117] "RemoveContainer" containerID="9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2" Oct 13 18:35:18 crc kubenswrapper[4974]: E1013 18:35:18.283305 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2\": container with ID starting with 9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2 not found: ID does not exist" containerID="9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 
18:35:18.283338 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2"} err="failed to get container status \"9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2\": rpc error: code = NotFound desc = could not find container \"9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2\": container with ID starting with 9329d18d7ac353654a37f475ebbc3089790ec57af55580074f0df82cc9e8e5d2 not found: ID does not exist" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.283360 4974 scope.go:117] "RemoveContainer" containerID="f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.296863 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-config-data" (OuterVolumeSpecName: "config-data") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.333803 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-server-conf" (OuterVolumeSpecName: "server-conf") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.347907 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.348780 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b2e987b-fd90-420d-86f1-b9757dd40b03-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.349016 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.349131 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt2ng\" (UniqueName: \"kubernetes.io/projected/4b2e987b-fd90-420d-86f1-b9757dd40b03-kube-api-access-jt2ng\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.349268 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc 
kubenswrapper[4974]: I1013 18:35:18.350567 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b2e987b-fd90-420d-86f1-b9757dd40b03-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.350836 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4b2e987b-fd90-420d-86f1-b9757dd40b03-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.351017 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4b2e987b-fd90-420d-86f1-b9757dd40b03-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.351141 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.351261 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b2e987b-fd90-420d-86f1-b9757dd40b03-config-data\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.351516 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.351696 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.351803 4974 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.351892 4974 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.351994 4974 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-pod-info\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.352087 4974 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.352180 4974 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.352283 4974 
reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-plugins-conf\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.352377 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xr2wt\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-kube-api-access-xr2wt\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.352464 4974 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-server-conf\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.352559 4974 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.415223 4974 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.423789 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" (UID: "a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.454596 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.454852 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b2e987b-fd90-420d-86f1-b9757dd40b03-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.454950 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.455018 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt2ng\" (UniqueName: \"kubernetes.io/projected/4b2e987b-fd90-420d-86f1-b9757dd40b03-kube-api-access-jt2ng\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.455083 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.455159 4974 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b2e987b-fd90-420d-86f1-b9757dd40b03-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.455247 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4b2e987b-fd90-420d-86f1-b9757dd40b03-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.455329 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4b2e987b-fd90-420d-86f1-b9757dd40b03-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.455392 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.455455 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b2e987b-fd90-420d-86f1-b9757dd40b03-config-data\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.455558 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.455670 4974 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.455730 4974 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.456464 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4b2e987b-fd90-420d-86f1-b9757dd40b03-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.456574 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.458079 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4b2e987b-fd90-420d-86f1-b9757dd40b03-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.459354 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4b2e987b-fd90-420d-86f1-b9757dd40b03-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.461392 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4b2e987b-fd90-420d-86f1-b9757dd40b03-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.461935 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.462124 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.466001 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.466304 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4b2e987b-fd90-420d-86f1-b9757dd40b03-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.473232 4974 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4b2e987b-fd90-420d-86f1-b9757dd40b03-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.496835 4974 scope.go:117] "RemoveContainer" containerID="f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.497932 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt2ng\" (UniqueName: \"kubernetes.io/projected/4b2e987b-fd90-420d-86f1-b9757dd40b03-kube-api-access-jt2ng\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.555297 4974 scope.go:117] "RemoveContainer" containerID="f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9" Oct 13 18:35:18 crc kubenswrapper[4974]: E1013 18:35:18.558754 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9\": container with ID starting with f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9 not found: ID does not exist" containerID="f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9" Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.558917 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9"} err="failed to get container status \"f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9\": rpc error: code = NotFound desc = could not find container \"f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9\": container with ID starting with 
f3f9c25cad9a96746676d5e1a815a9f20ee399f2b11adf18689127cdb69353b9 not found: ID does not exist"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.559001 4974 scope.go:117] "RemoveContainer" containerID="f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec"
Oct 13 18:35:18 crc kubenswrapper[4974]: E1013 18:35:18.559602 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec\": container with ID starting with f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec not found: ID does not exist" containerID="f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.559642 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec"} err="failed to get container status \"f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec\": rpc error: code = NotFound desc = could not find container \"f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec\": container with ID starting with f9feccc6065d63f691d7fbab7f9eab65f08587127c273cffc84fd6e683cab7ec not found: ID does not exist"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.574937 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"4b2e987b-fd90-420d-86f1-b9757dd40b03\") " pod="openstack/rabbitmq-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.735939 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.753178 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.761532 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.763055 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.766987 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.767171 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.767550 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.767959 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.768209 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.768368 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-wjncs"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.768727 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.775842 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.799163 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862155 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862220 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862239 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862263 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862284 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862326 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862395 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862563 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nlm5\" (UniqueName: \"kubernetes.io/projected/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-kube-api-access-4nlm5\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862632 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862710 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.862835 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.964857 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.965262 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.965291 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.965323 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.965353 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.965411 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.965467 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.965525 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nlm5\" (UniqueName: \"kubernetes.io/projected/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-kube-api-access-4nlm5\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.965559 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.965588 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.965731 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.966383 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.966469 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.967507 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.968275 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.968367 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.968725 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.975530 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.976284 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.983375 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:18 crc kubenswrapper[4974]: I1013 18:35:18.990159 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nlm5\" (UniqueName: \"kubernetes.io/projected/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-kube-api-access-4nlm5\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:19 crc kubenswrapper[4974]: I1013 18:35:19.000237 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4bf0b2fe-061e-486f-9e0f-96bd13bc7eae-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:19 crc kubenswrapper[4974]: I1013 18:35:19.014857 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae\") " pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:19 crc kubenswrapper[4974]: I1013 18:35:19.083469 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Oct 13 18:35:19 crc kubenswrapper[4974]: I1013 18:35:19.251676 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Oct 13 18:35:19 crc kubenswrapper[4974]: I1013 18:35:19.591463 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Oct 13 18:35:19 crc kubenswrapper[4974]: I1013 18:35:19.827449 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97f813f5-f34c-4b82-b066-032f8b795049" path="/var/lib/kubelet/pods/97f813f5-f34c-4b82-b066-032f8b795049/volumes"
Oct 13 18:35:19 crc kubenswrapper[4974]: I1013 18:35:19.828502 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f" path="/var/lib/kubelet/pods/a0d4fdc3-bd26-4379-8dfd-f6f3ce70a24f/volumes"
Oct 13 18:35:20 crc kubenswrapper[4974]: I1013 18:35:20.132667 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae","Type":"ContainerStarted","Data":"05b7ae558333bc3a03cae5159c75a5492c7b8107817f2922778836ee5333bbab"}
Oct 13 18:35:20 crc kubenswrapper[4974]: I1013 18:35:20.134416 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4b2e987b-fd90-420d-86f1-b9757dd40b03","Type":"ContainerStarted","Data":"df563a32049aaff767f2bb8910613914ca9e5a86129623d3c8080d262ed5e94e"}
Oct 13 18:35:22 crc kubenswrapper[4974]: I1013 18:35:22.156894 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae","Type":"ContainerStarted","Data":"a678c515069869aea47392e1cc130858a79988a69b7730d9980d5a7bff1a9cd6"}
Oct 13 18:35:22 crc kubenswrapper[4974]: I1013 18:35:22.159436 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4b2e987b-fd90-420d-86f1-b9757dd40b03","Type":"ContainerStarted","Data":"7dd35c40f33001c947258a42e9e7856ee294a6c46ec8626732702ccb351c57c0"}
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.229098 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-845969cbff-sk7vl"]
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.231766 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.241446 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.252991 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845969cbff-sk7vl"]
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.431511 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-svc\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.432888 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-sb\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.433133 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-swift-storage-0\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.433348 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6dgd\" (UniqueName: \"kubernetes.io/projected/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-kube-api-access-x6dgd\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.433447 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-config\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.433503 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-nb\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.433580 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-openstack-edpm-ipam\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.535093 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-config\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.535458 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-nb\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.535528 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-openstack-edpm-ipam\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.535566 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-svc\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.535595 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-sb\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.535683 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-swift-storage-0\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.535742 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6dgd\" (UniqueName: \"kubernetes.io/projected/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-kube-api-access-x6dgd\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.536417 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-config\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.536579 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-openstack-edpm-ipam\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.536826 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-svc\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.536930 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-sb\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.536949 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-nb\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.537937 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-swift-storage-0\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.563816 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6dgd\" (UniqueName: \"kubernetes.io/projected/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-kube-api-access-x6dgd\") pod \"dnsmasq-dns-845969cbff-sk7vl\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:26 crc kubenswrapper[4974]: I1013 18:35:26.851045 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:27 crc kubenswrapper[4974]: I1013 18:35:27.413560 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-845969cbff-sk7vl"]
Oct 13 18:35:27 crc kubenswrapper[4974]: W1013 18:35:27.418415 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2d915c2_0d65_4a45_9f58_1c5764d5dca2.slice/crio-2280b70f6573499157d095334c5f4d87c56a031720dfce4fe9c153b36175ac6c WatchSource:0}: Error finding container 2280b70f6573499157d095334c5f4d87c56a031720dfce4fe9c153b36175ac6c: Status 404 returned error can't find the container with id 2280b70f6573499157d095334c5f4d87c56a031720dfce4fe9c153b36175ac6c
Oct 13 18:35:28 crc kubenswrapper[4974]: I1013 18:35:28.275638 4974 generic.go:334] "Generic (PLEG): container finished" podID="a2d915c2-0d65-4a45-9f58-1c5764d5dca2" containerID="96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183" exitCode=0
Oct 13 18:35:28 crc kubenswrapper[4974]: I1013 18:35:28.275760 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845969cbff-sk7vl" event={"ID":"a2d915c2-0d65-4a45-9f58-1c5764d5dca2","Type":"ContainerDied","Data":"96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183"}
Oct 13 18:35:28 crc kubenswrapper[4974]: I1013 18:35:28.276043 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845969cbff-sk7vl" event={"ID":"a2d915c2-0d65-4a45-9f58-1c5764d5dca2","Type":"ContainerStarted","Data":"2280b70f6573499157d095334c5f4d87c56a031720dfce4fe9c153b36175ac6c"}
Oct 13 18:35:29 crc kubenswrapper[4974]: I1013 18:35:29.292644 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845969cbff-sk7vl" event={"ID":"a2d915c2-0d65-4a45-9f58-1c5764d5dca2","Type":"ContainerStarted","Data":"7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc"}
Oct 13 18:35:29 crc kubenswrapper[4974]: I1013 18:35:29.293193 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:29 crc kubenswrapper[4974]: I1013 18:35:29.321336 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-845969cbff-sk7vl" podStartSLOduration=3.321305023 podStartE2EDuration="3.321305023s" podCreationTimestamp="2025-10-13 18:35:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:35:29.313757203 +0000 UTC m=+1264.218123343" watchObservedRunningTime="2025-10-13 18:35:29.321305023 +0000 UTC m=+1264.225671143"
Oct 13 18:35:36 crc kubenswrapper[4974]: I1013 18:35:36.852957 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-845969cbff-sk7vl"
Oct 13 18:35:36 crc kubenswrapper[4974]: I1013 18:35:36.934924 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d8dc5545f-zxwtd"]
Oct 13 18:35:36 crc kubenswrapper[4974]: I1013 18:35:36.936084 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd" podUID="c00496a5-8aaa-48e0-a156-31a35d2299bf" containerName="dnsmasq-dns" containerID="cri-o://ef537a173a2dc3838e49a4b1626f35165d15576ca6e9f530b590d4e5f8709072" gracePeriod=10
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.229446 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"]
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.232673 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.273845 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"]
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.402210 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.402273 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kbzx\" (UniqueName: \"kubernetes.io/projected/83dacb6d-48a4-400a-9edb-74a61b3bf83f-kube-api-access-2kbzx\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.402293 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.402319 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-dns-svc\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.402353 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.402392 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.402414 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-config\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.403518 4974 generic.go:334] "Generic (PLEG): container finished" podID="c00496a5-8aaa-48e0-a156-31a35d2299bf" containerID="ef537a173a2dc3838e49a4b1626f35165d15576ca6e9f530b590d4e5f8709072" exitCode=0
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.403547 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd" event={"ID":"c00496a5-8aaa-48e0-a156-31a35d2299bf","Type":"ContainerDied","Data":"ef537a173a2dc3838e49a4b1626f35165d15576ca6e9f530b590d4e5f8709072"}
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.508188 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.508247 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-config\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.508368 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.508419 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kbzx\" (UniqueName: \"kubernetes.io/projected/83dacb6d-48a4-400a-9edb-74a61b3bf83f-kube-api-access-2kbzx\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.508448 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.508479 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-dns-svc\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.508523 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.510726 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-dns-swift-storage-0\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.519499 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-dns-svc\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.519563 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.520623 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-ovsdbserver-nb\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"
Oct 13 18:35:37 crc
kubenswrapper[4974]: I1013 18:35:37.521395 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-config\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.524016 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/83dacb6d-48a4-400a-9edb-74a61b3bf83f-ovsdbserver-sb\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.546004 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kbzx\" (UniqueName: \"kubernetes.io/projected/83dacb6d-48a4-400a-9edb-74a61b3bf83f-kube-api-access-2kbzx\") pod \"dnsmasq-dns-7d59b7cdcf-mbsgm\" (UID: \"83dacb6d-48a4-400a-9edb-74a61b3bf83f\") " pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.552487 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.673928 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.743116 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.743185 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.743231 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.744050 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5a8943cc34f896ee3b60b9837ff3add67567a5a52b9b5a1adbb600a9ed07e274"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.744094 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://5a8943cc34f896ee3b60b9837ff3add67567a5a52b9b5a1adbb600a9ed07e274" gracePeriod=600 Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.819583 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-svc\") pod \"c00496a5-8aaa-48e0-a156-31a35d2299bf\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.820474 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-config\") pod \"c00496a5-8aaa-48e0-a156-31a35d2299bf\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.820752 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-nb\") pod \"c00496a5-8aaa-48e0-a156-31a35d2299bf\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.820882 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-sb\") pod \"c00496a5-8aaa-48e0-a156-31a35d2299bf\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.820988 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-295r4\" (UniqueName: \"kubernetes.io/projected/c00496a5-8aaa-48e0-a156-31a35d2299bf-kube-api-access-295r4\") pod \"c00496a5-8aaa-48e0-a156-31a35d2299bf\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.821084 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-swift-storage-0\") pod \"c00496a5-8aaa-48e0-a156-31a35d2299bf\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " Oct 13 18:35:37 crc 
kubenswrapper[4974]: I1013 18:35:37.826028 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c00496a5-8aaa-48e0-a156-31a35d2299bf-kube-api-access-295r4" (OuterVolumeSpecName: "kube-api-access-295r4") pod "c00496a5-8aaa-48e0-a156-31a35d2299bf" (UID: "c00496a5-8aaa-48e0-a156-31a35d2299bf"). InnerVolumeSpecName "kube-api-access-295r4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.899846 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c00496a5-8aaa-48e0-a156-31a35d2299bf" (UID: "c00496a5-8aaa-48e0-a156-31a35d2299bf"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.922726 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c00496a5-8aaa-48e0-a156-31a35d2299bf" (UID: "c00496a5-8aaa-48e0-a156-31a35d2299bf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.922803 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-swift-storage-0\") pod \"c00496a5-8aaa-48e0-a156-31a35d2299bf\" (UID: \"c00496a5-8aaa-48e0-a156-31a35d2299bf\") " Oct 13 18:35:37 crc kubenswrapper[4974]: W1013 18:35:37.922905 4974 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/c00496a5-8aaa-48e0-a156-31a35d2299bf/volumes/kubernetes.io~configmap/dns-swift-storage-0 Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.922915 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c00496a5-8aaa-48e0-a156-31a35d2299bf" (UID: "c00496a5-8aaa-48e0-a156-31a35d2299bf"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.923569 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-295r4\" (UniqueName: \"kubernetes.io/projected/c00496a5-8aaa-48e0-a156-31a35d2299bf-kube-api-access-295r4\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.923582 4974 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.923591 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.944067 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-config" (OuterVolumeSpecName: "config") pod "c00496a5-8aaa-48e0-a156-31a35d2299bf" (UID: "c00496a5-8aaa-48e0-a156-31a35d2299bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.950220 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c00496a5-8aaa-48e0-a156-31a35d2299bf" (UID: "c00496a5-8aaa-48e0-a156-31a35d2299bf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:37 crc kubenswrapper[4974]: I1013 18:35:37.959778 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c00496a5-8aaa-48e0-a156-31a35d2299bf" (UID: "c00496a5-8aaa-48e0-a156-31a35d2299bf"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.025524 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.025566 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.025576 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c00496a5-8aaa-48e0-a156-31a35d2299bf-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:38 crc kubenswrapper[4974]: W1013 18:35:38.060751 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83dacb6d_48a4_400a_9edb_74a61b3bf83f.slice/crio-fbe6aae3c9434de5360aef3cb58eb47d20f9942fc36da3dbf1533a9418ecd90d WatchSource:0}: Error finding container fbe6aae3c9434de5360aef3cb58eb47d20f9942fc36da3dbf1533a9418ecd90d: Status 404 returned error can't find the container with id fbe6aae3c9434de5360aef3cb58eb47d20f9942fc36da3dbf1533a9418ecd90d Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.067268 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d59b7cdcf-mbsgm"] Oct 13 18:35:38 crc 
kubenswrapper[4974]: I1013 18:35:38.419865 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="5a8943cc34f896ee3b60b9837ff3add67567a5a52b9b5a1adbb600a9ed07e274" exitCode=0 Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.419929 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"5a8943cc34f896ee3b60b9837ff3add67567a5a52b9b5a1adbb600a9ed07e274"} Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.419960 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"ea39e73c9d324740300df4165ffa4e49cea26ef15903575dd3e89904e7bd3e53"} Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.419976 4974 scope.go:117] "RemoveContainer" containerID="171b1edc0ad6306edaf67441f6ae19fb0da0e0db23e98eea0abca2248299e8ae" Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.424613 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd" event={"ID":"c00496a5-8aaa-48e0-a156-31a35d2299bf","Type":"ContainerDied","Data":"55dbf4046374e846074c0a5a1d86b4e73c4468cd160b34f8f9ced60e1281900e"} Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.424631 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d8dc5545f-zxwtd" Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.428026 4974 generic.go:334] "Generic (PLEG): container finished" podID="83dacb6d-48a4-400a-9edb-74a61b3bf83f" containerID="72c5ed431bb398d5d25df267395a782494e03402977261c1650ed4a0c504ab5d" exitCode=0 Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.428079 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm" event={"ID":"83dacb6d-48a4-400a-9edb-74a61b3bf83f","Type":"ContainerDied","Data":"72c5ed431bb398d5d25df267395a782494e03402977261c1650ed4a0c504ab5d"} Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.428112 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm" event={"ID":"83dacb6d-48a4-400a-9edb-74a61b3bf83f","Type":"ContainerStarted","Data":"fbe6aae3c9434de5360aef3cb58eb47d20f9942fc36da3dbf1533a9418ecd90d"} Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.481045 4974 scope.go:117] "RemoveContainer" containerID="ef537a173a2dc3838e49a4b1626f35165d15576ca6e9f530b590d4e5f8709072" Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.554139 4974 scope.go:117] "RemoveContainer" containerID="55163cfaa7e18ac43af395568c0b420dbc9d6ffbbb2bfd7be356befc68c405c9" Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.567752 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d8dc5545f-zxwtd"] Oct 13 18:35:38 crc kubenswrapper[4974]: I1013 18:35:38.580380 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d8dc5545f-zxwtd"] Oct 13 18:35:39 crc kubenswrapper[4974]: I1013 18:35:39.461341 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm" event={"ID":"83dacb6d-48a4-400a-9edb-74a61b3bf83f","Type":"ContainerStarted","Data":"c7fb21e0af660607225b4c4943d96280920190ace10420af71fc7ece2322b062"} Oct 13 18:35:39 crc kubenswrapper[4974]: 
I1013 18:35:39.484931 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm" podStartSLOduration=2.484914471 podStartE2EDuration="2.484914471s" podCreationTimestamp="2025-10-13 18:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:35:39.481086845 +0000 UTC m=+1274.385452935" watchObservedRunningTime="2025-10-13 18:35:39.484914471 +0000 UTC m=+1274.389280551" Oct 13 18:35:39 crc kubenswrapper[4974]: I1013 18:35:39.824738 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c00496a5-8aaa-48e0-a156-31a35d2299bf" path="/var/lib/kubelet/pods/c00496a5-8aaa-48e0-a156-31a35d2299bf/volumes" Oct 13 18:35:40 crc kubenswrapper[4974]: I1013 18:35:40.482418 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm" Oct 13 18:35:47 crc kubenswrapper[4974]: I1013 18:35:47.554862 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d59b7cdcf-mbsgm" Oct 13 18:35:47 crc kubenswrapper[4974]: I1013 18:35:47.669873 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845969cbff-sk7vl"] Oct 13 18:35:47 crc kubenswrapper[4974]: I1013 18:35:47.670437 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-845969cbff-sk7vl" podUID="a2d915c2-0d65-4a45-9f58-1c5764d5dca2" containerName="dnsmasq-dns" containerID="cri-o://7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc" gracePeriod=10 Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.237221 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-845969cbff-sk7vl" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.404495 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-svc\") pod \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.404647 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-swift-storage-0\") pod \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.404726 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6dgd\" (UniqueName: \"kubernetes.io/projected/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-kube-api-access-x6dgd\") pod \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.404801 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-config\") pod \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.405180 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-openstack-edpm-ipam\") pod \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.405216 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-sb\") pod \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.405332 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-nb\") pod \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\" (UID: \"a2d915c2-0d65-4a45-9f58-1c5764d5dca2\") " Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.421997 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-kube-api-access-x6dgd" (OuterVolumeSpecName: "kube-api-access-x6dgd") pod "a2d915c2-0d65-4a45-9f58-1c5764d5dca2" (UID: "a2d915c2-0d65-4a45-9f58-1c5764d5dca2"). InnerVolumeSpecName "kube-api-access-x6dgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.471008 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a2d915c2-0d65-4a45-9f58-1c5764d5dca2" (UID: "a2d915c2-0d65-4a45-9f58-1c5764d5dca2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.473575 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a2d915c2-0d65-4a45-9f58-1c5764d5dca2" (UID: "a2d915c2-0d65-4a45-9f58-1c5764d5dca2"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.476360 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-config" (OuterVolumeSpecName: "config") pod "a2d915c2-0d65-4a45-9f58-1c5764d5dca2" (UID: "a2d915c2-0d65-4a45-9f58-1c5764d5dca2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.480771 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a2d915c2-0d65-4a45-9f58-1c5764d5dca2" (UID: "a2d915c2-0d65-4a45-9f58-1c5764d5dca2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.494744 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a2d915c2-0d65-4a45-9f58-1c5764d5dca2" (UID: "a2d915c2-0d65-4a45-9f58-1c5764d5dca2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.495707 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a2d915c2-0d65-4a45-9f58-1c5764d5dca2" (UID: "a2d915c2-0d65-4a45-9f58-1c5764d5dca2"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.510023 4974 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.510071 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.510090 4974 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.510107 4974 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-svc\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.510123 4974 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.510138 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6dgd\" (UniqueName: \"kubernetes.io/projected/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-kube-api-access-x6dgd\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.510154 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2d915c2-0d65-4a45-9f58-1c5764d5dca2-config\") on node \"crc\" DevicePath \"\"" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.612502 
4974 generic.go:334] "Generic (PLEG): container finished" podID="a2d915c2-0d65-4a45-9f58-1c5764d5dca2" containerID="7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc" exitCode=0 Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.612622 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-845969cbff-sk7vl" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.612695 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845969cbff-sk7vl" event={"ID":"a2d915c2-0d65-4a45-9f58-1c5764d5dca2","Type":"ContainerDied","Data":"7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc"} Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.617549 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-845969cbff-sk7vl" event={"ID":"a2d915c2-0d65-4a45-9f58-1c5764d5dca2","Type":"ContainerDied","Data":"2280b70f6573499157d095334c5f4d87c56a031720dfce4fe9c153b36175ac6c"} Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.617592 4974 scope.go:117] "RemoveContainer" containerID="7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.643494 4974 scope.go:117] "RemoveContainer" containerID="96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.660973 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-845969cbff-sk7vl"] Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.670295 4974 scope.go:117] "RemoveContainer" containerID="7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc" Oct 13 18:35:48 crc kubenswrapper[4974]: E1013 18:35:48.670824 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc\": container with ID starting with 
7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc not found: ID does not exist" containerID="7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.670854 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc"} err="failed to get container status \"7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc\": rpc error: code = NotFound desc = could not find container \"7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc\": container with ID starting with 7f2e8394df417b5f14e05c8daec80003137d9e107fab734f2f24a95e9fbe40dc not found: ID does not exist" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.670874 4974 scope.go:117] "RemoveContainer" containerID="96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183" Oct 13 18:35:48 crc kubenswrapper[4974]: E1013 18:35:48.671203 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183\": container with ID starting with 96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183 not found: ID does not exist" containerID="96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.671232 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183"} err="failed to get container status \"96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183\": rpc error: code = NotFound desc = could not find container \"96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183\": container with ID starting with 96e037399ac33828f113226f332c98bacd19fa0e3bdd4222792ee48e0872c183 not found: ID does not 
exist" Oct 13 18:35:48 crc kubenswrapper[4974]: I1013 18:35:48.673323 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-845969cbff-sk7vl"] Oct 13 18:35:49 crc kubenswrapper[4974]: I1013 18:35:49.830978 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d915c2-0d65-4a45-9f58-1c5764d5dca2" path="/var/lib/kubelet/pods/a2d915c2-0d65-4a45-9f58-1c5764d5dca2/volumes" Oct 13 18:35:54 crc kubenswrapper[4974]: I1013 18:35:54.709574 4974 generic.go:334] "Generic (PLEG): container finished" podID="4bf0b2fe-061e-486f-9e0f-96bd13bc7eae" containerID="a678c515069869aea47392e1cc130858a79988a69b7730d9980d5a7bff1a9cd6" exitCode=0 Oct 13 18:35:54 crc kubenswrapper[4974]: I1013 18:35:54.710083 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae","Type":"ContainerDied","Data":"a678c515069869aea47392e1cc130858a79988a69b7730d9980d5a7bff1a9cd6"} Oct 13 18:35:54 crc kubenswrapper[4974]: I1013 18:35:54.722212 4974 generic.go:334] "Generic (PLEG): container finished" podID="4b2e987b-fd90-420d-86f1-b9757dd40b03" containerID="7dd35c40f33001c947258a42e9e7856ee294a6c46ec8626732702ccb351c57c0" exitCode=0 Oct 13 18:35:54 crc kubenswrapper[4974]: I1013 18:35:54.722265 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4b2e987b-fd90-420d-86f1-b9757dd40b03","Type":"ContainerDied","Data":"7dd35c40f33001c947258a42e9e7856ee294a6c46ec8626732702ccb351c57c0"} Oct 13 18:35:55 crc kubenswrapper[4974]: I1013 18:35:55.766710 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4bf0b2fe-061e-486f-9e0f-96bd13bc7eae","Type":"ContainerStarted","Data":"a6945ac94647359d120e005eba9584c6c4e8f01ae496e6d73e29a31c1e1b50ac"} Oct 13 18:35:55 crc kubenswrapper[4974]: I1013 18:35:55.767736 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:35:55 crc kubenswrapper[4974]: I1013 18:35:55.768882 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4b2e987b-fd90-420d-86f1-b9757dd40b03","Type":"ContainerStarted","Data":"e9ec334ce576f4061438e9976a5c9c2c1bedd81a3b46041c40c35b61debd8ceb"} Oct 13 18:35:55 crc kubenswrapper[4974]: I1013 18:35:55.775977 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Oct 13 18:35:55 crc kubenswrapper[4974]: I1013 18:35:55.810096 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.81006758 podStartE2EDuration="37.81006758s" podCreationTimestamp="2025-10-13 18:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:35:55.803022814 +0000 UTC m=+1290.707388904" watchObservedRunningTime="2025-10-13 18:35:55.81006758 +0000 UTC m=+1290.714433680" Oct 13 18:35:55 crc kubenswrapper[4974]: I1013 18:35:55.842623 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.842605085 podStartE2EDuration="37.842605085s" podCreationTimestamp="2025-10-13 18:35:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 18:35:55.830766936 +0000 UTC m=+1290.735133016" watchObservedRunningTime="2025-10-13 18:35:55.842605085 +0000 UTC m=+1290.746971175" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.475589 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx"] Oct 13 18:36:01 crc kubenswrapper[4974]: E1013 18:36:01.476681 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c00496a5-8aaa-48e0-a156-31a35d2299bf" 
containerName="init" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.476702 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00496a5-8aaa-48e0-a156-31a35d2299bf" containerName="init" Oct 13 18:36:01 crc kubenswrapper[4974]: E1013 18:36:01.476720 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d915c2-0d65-4a45-9f58-1c5764d5dca2" containerName="init" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.476728 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d915c2-0d65-4a45-9f58-1c5764d5dca2" containerName="init" Oct 13 18:36:01 crc kubenswrapper[4974]: E1013 18:36:01.476752 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c00496a5-8aaa-48e0-a156-31a35d2299bf" containerName="dnsmasq-dns" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.476762 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00496a5-8aaa-48e0-a156-31a35d2299bf" containerName="dnsmasq-dns" Oct 13 18:36:01 crc kubenswrapper[4974]: E1013 18:36:01.476785 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d915c2-0d65-4a45-9f58-1c5764d5dca2" containerName="dnsmasq-dns" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.476793 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d915c2-0d65-4a45-9f58-1c5764d5dca2" containerName="dnsmasq-dns" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.477049 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d915c2-0d65-4a45-9f58-1c5764d5dca2" containerName="dnsmasq-dns" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.477086 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="c00496a5-8aaa-48e0-a156-31a35d2299bf" containerName="dnsmasq-dns" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.477912 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.480346 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.480578 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.481569 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.489195 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.495949 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx"] Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.615810 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.616094 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.616240 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrh9\" (UniqueName: \"kubernetes.io/projected/3d236165-9044-430d-92cf-33e4eadd281f-kube-api-access-qxrh9\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.616346 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.718316 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrh9\" (UniqueName: \"kubernetes.io/projected/3d236165-9044-430d-92cf-33e4eadd281f-kube-api-access-qxrh9\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.718395 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.718471 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.718562 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.730451 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.730956 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.731422 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.738815 4974 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qxrh9\" (UniqueName: \"kubernetes.io/projected/3d236165-9044-430d-92cf-33e4eadd281f-kube-api-access-qxrh9\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:01 crc kubenswrapper[4974]: I1013 18:36:01.812506 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:02 crc kubenswrapper[4974]: I1013 18:36:02.463170 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx"] Oct 13 18:36:02 crc kubenswrapper[4974]: I1013 18:36:02.850591 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" event={"ID":"3d236165-9044-430d-92cf-33e4eadd281f","Type":"ContainerStarted","Data":"2a867b16e282dd5be7511c6acdb132c61c099001f9a8c64415916d9a5ecad0ba"} Oct 13 18:36:08 crc kubenswrapper[4974]: I1013 18:36:08.802872 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Oct 13 18:36:09 crc kubenswrapper[4974]: I1013 18:36:09.088002 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Oct 13 18:36:11 crc kubenswrapper[4974]: I1013 18:36:11.956486 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" event={"ID":"3d236165-9044-430d-92cf-33e4eadd281f","Type":"ContainerStarted","Data":"6edc8efdd2ef70964b2994f87c9594db50dffdbc23107cd3902642bac77d40a7"} Oct 13 18:36:11 crc kubenswrapper[4974]: I1013 18:36:11.984091 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" podStartSLOduration=2.222029826 
podStartE2EDuration="10.984067772s" podCreationTimestamp="2025-10-13 18:36:01 +0000 UTC" firstStartedPulling="2025-10-13 18:36:02.467197694 +0000 UTC m=+1297.371563784" lastFinishedPulling="2025-10-13 18:36:11.22923565 +0000 UTC m=+1306.133601730" observedRunningTime="2025-10-13 18:36:11.973077273 +0000 UTC m=+1306.877443353" watchObservedRunningTime="2025-10-13 18:36:11.984067772 +0000 UTC m=+1306.888433872" Oct 13 18:36:24 crc kubenswrapper[4974]: I1013 18:36:24.115150 4974 generic.go:334] "Generic (PLEG): container finished" podID="3d236165-9044-430d-92cf-33e4eadd281f" containerID="6edc8efdd2ef70964b2994f87c9594db50dffdbc23107cd3902642bac77d40a7" exitCode=0 Oct 13 18:36:24 crc kubenswrapper[4974]: I1013 18:36:24.115280 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" event={"ID":"3d236165-9044-430d-92cf-33e4eadd281f","Type":"ContainerDied","Data":"6edc8efdd2ef70964b2994f87c9594db50dffdbc23107cd3902642bac77d40a7"} Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.669146 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.796883 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxrh9\" (UniqueName: \"kubernetes.io/projected/3d236165-9044-430d-92cf-33e4eadd281f-kube-api-access-qxrh9\") pod \"3d236165-9044-430d-92cf-33e4eadd281f\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.797074 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-inventory\") pod \"3d236165-9044-430d-92cf-33e4eadd281f\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.797148 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-repo-setup-combined-ca-bundle\") pod \"3d236165-9044-430d-92cf-33e4eadd281f\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.797316 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-ssh-key\") pod \"3d236165-9044-430d-92cf-33e4eadd281f\" (UID: \"3d236165-9044-430d-92cf-33e4eadd281f\") " Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.802301 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d236165-9044-430d-92cf-33e4eadd281f-kube-api-access-qxrh9" (OuterVolumeSpecName: "kube-api-access-qxrh9") pod "3d236165-9044-430d-92cf-33e4eadd281f" (UID: "3d236165-9044-430d-92cf-33e4eadd281f"). InnerVolumeSpecName "kube-api-access-qxrh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.806847 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3d236165-9044-430d-92cf-33e4eadd281f" (UID: "3d236165-9044-430d-92cf-33e4eadd281f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.826758 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-inventory" (OuterVolumeSpecName: "inventory") pod "3d236165-9044-430d-92cf-33e4eadd281f" (UID: "3d236165-9044-430d-92cf-33e4eadd281f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.863783 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "3d236165-9044-430d-92cf-33e4eadd281f" (UID: "3d236165-9044-430d-92cf-33e4eadd281f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.899119 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxrh9\" (UniqueName: \"kubernetes.io/projected/3d236165-9044-430d-92cf-33e4eadd281f-kube-api-access-qxrh9\") on node \"crc\" DevicePath \"\"" Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.899159 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.899176 4974 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:36:25 crc kubenswrapper[4974]: I1013 18:36:25.899188 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/3d236165-9044-430d-92cf-33e4eadd281f-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.145376 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" event={"ID":"3d236165-9044-430d-92cf-33e4eadd281f","Type":"ContainerDied","Data":"2a867b16e282dd5be7511c6acdb132c61c099001f9a8c64415916d9a5ecad0ba"} Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.145436 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a867b16e282dd5be7511c6acdb132c61c099001f9a8c64415916d9a5ecad0ba" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.145497 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.273905 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw"] Oct 13 18:36:26 crc kubenswrapper[4974]: E1013 18:36:26.275189 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d236165-9044-430d-92cf-33e4eadd281f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.275429 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d236165-9044-430d-92cf-33e4eadd281f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.276210 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d236165-9044-430d-92cf-33e4eadd281f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.277607 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.279943 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.281548 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.282070 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.282277 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.292012 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw"] Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.308153 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fg6xw\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.308407 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbcmw\" (UniqueName: \"kubernetes.io/projected/4f6b26b6-df93-48fd-bbec-18aa5a371db8-kube-api-access-kbcmw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fg6xw\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.308570 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fg6xw\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.411240 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fg6xw\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.411740 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbcmw\" (UniqueName: \"kubernetes.io/projected/4f6b26b6-df93-48fd-bbec-18aa5a371db8-kube-api-access-kbcmw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fg6xw\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.411814 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fg6xw\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.416539 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fg6xw\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.416840 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fg6xw\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.436733 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbcmw\" (UniqueName: \"kubernetes.io/projected/4f6b26b6-df93-48fd-bbec-18aa5a371db8-kube-api-access-kbcmw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-fg6xw\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:26 crc kubenswrapper[4974]: I1013 18:36:26.607232 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:27 crc kubenswrapper[4974]: W1013 18:36:27.247719 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f6b26b6_df93_48fd_bbec_18aa5a371db8.slice/crio-a725dbbdc6f96a90ff6c11b535771328795d1382ce44f252bcb049d7053eeaff WatchSource:0}: Error finding container a725dbbdc6f96a90ff6c11b535771328795d1382ce44f252bcb049d7053eeaff: Status 404 returned error can't find the container with id a725dbbdc6f96a90ff6c11b535771328795d1382ce44f252bcb049d7053eeaff Oct 13 18:36:27 crc kubenswrapper[4974]: I1013 18:36:27.257727 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw"] Oct 13 18:36:28 crc kubenswrapper[4974]: I1013 18:36:28.177226 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" event={"ID":"4f6b26b6-df93-48fd-bbec-18aa5a371db8","Type":"ContainerStarted","Data":"969a43a60baea60bf674ee45c1296c47a860b17c0f33a9faf2c7fde8fce4b793"} Oct 13 18:36:28 crc kubenswrapper[4974]: I1013 18:36:28.177578 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" event={"ID":"4f6b26b6-df93-48fd-bbec-18aa5a371db8","Type":"ContainerStarted","Data":"a725dbbdc6f96a90ff6c11b535771328795d1382ce44f252bcb049d7053eeaff"} Oct 13 18:36:28 crc kubenswrapper[4974]: I1013 18:36:28.206490 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" podStartSLOduration=1.73494883 podStartE2EDuration="2.206467687s" podCreationTimestamp="2025-10-13 18:36:26 +0000 UTC" firstStartedPulling="2025-10-13 18:36:27.251175998 +0000 UTC m=+1322.155542088" lastFinishedPulling="2025-10-13 18:36:27.722694825 +0000 UTC m=+1322.627060945" observedRunningTime="2025-10-13 
18:36:28.20124905 +0000 UTC m=+1323.105615190" watchObservedRunningTime="2025-10-13 18:36:28.206467687 +0000 UTC m=+1323.110833777" Oct 13 18:36:31 crc kubenswrapper[4974]: I1013 18:36:31.227753 4974 generic.go:334] "Generic (PLEG): container finished" podID="4f6b26b6-df93-48fd-bbec-18aa5a371db8" containerID="969a43a60baea60bf674ee45c1296c47a860b17c0f33a9faf2c7fde8fce4b793" exitCode=0 Oct 13 18:36:31 crc kubenswrapper[4974]: I1013 18:36:31.228363 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" event={"ID":"4f6b26b6-df93-48fd-bbec-18aa5a371db8","Type":"ContainerDied","Data":"969a43a60baea60bf674ee45c1296c47a860b17c0f33a9faf2c7fde8fce4b793"} Oct 13 18:36:32 crc kubenswrapper[4974]: I1013 18:36:32.727479 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" Oct 13 18:36:32 crc kubenswrapper[4974]: I1013 18:36:32.751676 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbcmw\" (UniqueName: \"kubernetes.io/projected/4f6b26b6-df93-48fd-bbec-18aa5a371db8-kube-api-access-kbcmw\") pod \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " Oct 13 18:36:32 crc kubenswrapper[4974]: I1013 18:36:32.751731 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-ssh-key\") pod \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " Oct 13 18:36:32 crc kubenswrapper[4974]: I1013 18:36:32.751784 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-inventory\") pod \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\" (UID: \"4f6b26b6-df93-48fd-bbec-18aa5a371db8\") " Oct 13 18:36:32 crc 
kubenswrapper[4974]: I1013 18:36:32.759727 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f6b26b6-df93-48fd-bbec-18aa5a371db8-kube-api-access-kbcmw" (OuterVolumeSpecName: "kube-api-access-kbcmw") pod "4f6b26b6-df93-48fd-bbec-18aa5a371db8" (UID: "4f6b26b6-df93-48fd-bbec-18aa5a371db8"). InnerVolumeSpecName "kube-api-access-kbcmw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 18:36:32 crc kubenswrapper[4974]: I1013 18:36:32.781888 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f6b26b6-df93-48fd-bbec-18aa5a371db8" (UID: "4f6b26b6-df93-48fd-bbec-18aa5a371db8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:36:32 crc kubenswrapper[4974]: I1013 18:36:32.803794 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-inventory" (OuterVolumeSpecName: "inventory") pod "4f6b26b6-df93-48fd-bbec-18aa5a371db8" (UID: "4f6b26b6-df93-48fd-bbec-18aa5a371db8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:36:32 crc kubenswrapper[4974]: I1013 18:36:32.855094 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbcmw\" (UniqueName: \"kubernetes.io/projected/4f6b26b6-df93-48fd-bbec-18aa5a371db8-kube-api-access-kbcmw\") on node \"crc\" DevicePath \"\""
Oct 13 18:36:32 crc kubenswrapper[4974]: I1013 18:36:32.855145 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 13 18:36:32 crc kubenswrapper[4974]: I1013 18:36:32.855163 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f6b26b6-df93-48fd-bbec-18aa5a371db8-inventory\") on node \"crc\" DevicePath \"\""
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.260114 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw" event={"ID":"4f6b26b6-df93-48fd-bbec-18aa5a371db8","Type":"ContainerDied","Data":"a725dbbdc6f96a90ff6c11b535771328795d1382ce44f252bcb049d7053eeaff"}
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.260172 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a725dbbdc6f96a90ff6c11b535771328795d1382ce44f252bcb049d7053eeaff"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.260190 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-fg6xw"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.448178 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"]
Oct 13 18:36:33 crc kubenswrapper[4974]: E1013 18:36:33.449045 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f6b26b6-df93-48fd-bbec-18aa5a371db8" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.449111 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f6b26b6-df93-48fd-bbec-18aa5a371db8" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.449400 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f6b26b6-df93-48fd-bbec-18aa5a371db8" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.450358 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.453233 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.454246 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.454435 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.458033 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.468924 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.469064 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxx4d\" (UniqueName: \"kubernetes.io/projected/684a8cdf-df17-41a7-87b8-9027cb982025-kube-api-access-qxx4d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.469130 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.469197 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.472899 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"]
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.570010 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.570168 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxx4d\" (UniqueName: \"kubernetes.io/projected/684a8cdf-df17-41a7-87b8-9027cb982025-kube-api-access-qxx4d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.570218 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.570274 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.577481 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.581321 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.587587 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.591716 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxx4d\" (UniqueName: \"kubernetes.io/projected/684a8cdf-df17-41a7-87b8-9027cb982025-kube-api-access-qxx4d\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:33 crc kubenswrapper[4974]: I1013 18:36:33.776313 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"
Oct 13 18:36:34 crc kubenswrapper[4974]: I1013 18:36:34.331119 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg"]
Oct 13 18:36:34 crc kubenswrapper[4974]: W1013 18:36:34.334283 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod684a8cdf_df17_41a7_87b8_9027cb982025.slice/crio-cfc0f62d2be9821c176ab9b13f51433283d6f59c30531bf0a4589c4a86a45d23 WatchSource:0}: Error finding container cfc0f62d2be9821c176ab9b13f51433283d6f59c30531bf0a4589c4a86a45d23: Status 404 returned error can't find the container with id cfc0f62d2be9821c176ab9b13f51433283d6f59c30531bf0a4589c4a86a45d23
Oct 13 18:36:35 crc kubenswrapper[4974]: I1013 18:36:35.282972 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg" event={"ID":"684a8cdf-df17-41a7-87b8-9027cb982025","Type":"ContainerStarted","Data":"43aa7a036a9dff60ec4567895a41b1d2fc39d5f5f11ce21efa621057fbba8b85"}
Oct 13 18:36:35 crc kubenswrapper[4974]: I1013 18:36:35.283803 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg" event={"ID":"684a8cdf-df17-41a7-87b8-9027cb982025","Type":"ContainerStarted","Data":"cfc0f62d2be9821c176ab9b13f51433283d6f59c30531bf0a4589c4a86a45d23"}
Oct 13 18:36:35 crc kubenswrapper[4974]: I1013 18:36:35.314264 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg" podStartSLOduration=1.79701512 podStartE2EDuration="2.314238542s" podCreationTimestamp="2025-10-13 18:36:33 +0000 UTC" firstStartedPulling="2025-10-13 18:36:34.337019057 +0000 UTC m=+1329.241385137" lastFinishedPulling="2025-10-13 18:36:34.854242459 +0000 UTC m=+1329.758608559" observedRunningTime="2025-10-13 18:36:35.311510665 +0000 UTC m=+1330.215876765" watchObservedRunningTime="2025-10-13 18:36:35.314238542 +0000 UTC m=+1330.218604652"
Oct 13 18:36:46 crc kubenswrapper[4974]: I1013 18:36:46.110401 4974 scope.go:117] "RemoveContainer" containerID="7ed3a4b63164c74e011c06c8a38b884cbac99cd9f55e21363135f25bcacd086a"
Oct 13 18:36:46 crc kubenswrapper[4974]: I1013 18:36:46.157817 4974 scope.go:117] "RemoveContainer" containerID="e13e34ba0f4e9f55479f04a643e78140f9612282cf82aee579f4ccd1b24af1eb"
Oct 13 18:36:46 crc kubenswrapper[4974]: I1013 18:36:46.245912 4974 scope.go:117] "RemoveContainer" containerID="ca65f504f908623cde173873fab57be00d92d9ce2b70f357fc3564a84a2d51a8"
Oct 13 18:36:46 crc kubenswrapper[4974]: I1013 18:36:46.315193 4974 scope.go:117] "RemoveContainer" containerID="b2b9cfeee6089bf9ad2b4ad83cff4fd85575fca3cbee58b630a2b257678224ed"
Oct 13 18:36:46 crc kubenswrapper[4974]: I1013 18:36:46.347320 4974 scope.go:117] "RemoveContainer" containerID="5544ce163a08127e1c9ef75f7b429c575d7a7198e2bf33873fa4cb6fc3f3a050"
Oct 13 18:37:46 crc kubenswrapper[4974]: I1013 18:37:46.533531 4974 scope.go:117] "RemoveContainer" containerID="4ae96bfa993eb5985f9b00374da0d3249ad1c982eea46fc5e9e966f77ec5a5c3"
Oct 13 18:38:07 crc kubenswrapper[4974]: I1013 18:38:07.742992 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 18:38:07 crc kubenswrapper[4974]: I1013 18:38:07.743635 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.511369 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-j6q8m"]
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.517295 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.546147 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6q8m"]
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.619584 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-utilities\") pod \"community-operators-j6q8m\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") " pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.619919 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpklf\" (UniqueName: \"kubernetes.io/projected/67b1c08d-f1a5-445a-8bce-8d1de6b37483-kube-api-access-mpklf\") pod \"community-operators-j6q8m\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") " pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.620063 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-catalog-content\") pod \"community-operators-j6q8m\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") " pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.721440 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-utilities\") pod \"community-operators-j6q8m\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") " pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.721849 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpklf\" (UniqueName: \"kubernetes.io/projected/67b1c08d-f1a5-445a-8bce-8d1de6b37483-kube-api-access-mpklf\") pod \"community-operators-j6q8m\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") " pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.722000 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-catalog-content\") pod \"community-operators-j6q8m\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") " pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.722013 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-utilities\") pod \"community-operators-j6q8m\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") " pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.722280 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-catalog-content\") pod \"community-operators-j6q8m\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") " pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.741035 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpklf\" (UniqueName: \"kubernetes.io/projected/67b1c08d-f1a5-445a-8bce-8d1de6b37483-kube-api-access-mpklf\") pod \"community-operators-j6q8m\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") " pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:29 crc kubenswrapper[4974]: I1013 18:38:29.861707 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:30 crc kubenswrapper[4974]: W1013 18:38:30.433020 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67b1c08d_f1a5_445a_8bce_8d1de6b37483.slice/crio-2f2e17cb4fec469b8e6bf2745b53faf6941dac84e75c23f94ee31c9a71fffe3d WatchSource:0}: Error finding container 2f2e17cb4fec469b8e6bf2745b53faf6941dac84e75c23f94ee31c9a71fffe3d: Status 404 returned error can't find the container with id 2f2e17cb4fec469b8e6bf2745b53faf6941dac84e75c23f94ee31c9a71fffe3d
Oct 13 18:38:30 crc kubenswrapper[4974]: I1013 18:38:30.444579 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-j6q8m"]
Oct 13 18:38:30 crc kubenswrapper[4974]: I1013 18:38:30.820400 4974 generic.go:334] "Generic (PLEG): container finished" podID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerID="7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51" exitCode=0
Oct 13 18:38:30 crc kubenswrapper[4974]: I1013 18:38:30.820621 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6q8m" event={"ID":"67b1c08d-f1a5-445a-8bce-8d1de6b37483","Type":"ContainerDied","Data":"7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51"}
Oct 13 18:38:30 crc kubenswrapper[4974]: I1013 18:38:30.820640 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6q8m" event={"ID":"67b1c08d-f1a5-445a-8bce-8d1de6b37483","Type":"ContainerStarted","Data":"2f2e17cb4fec469b8e6bf2745b53faf6941dac84e75c23f94ee31c9a71fffe3d"}
Oct 13 18:38:32 crc kubenswrapper[4974]: I1013 18:38:32.864328 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6q8m" event={"ID":"67b1c08d-f1a5-445a-8bce-8d1de6b37483","Type":"ContainerStarted","Data":"93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201"}
Oct 13 18:38:33 crc kubenswrapper[4974]: I1013 18:38:33.880019 4974 generic.go:334] "Generic (PLEG): container finished" podID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerID="93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201" exitCode=0
Oct 13 18:38:33 crc kubenswrapper[4974]: I1013 18:38:33.880085 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6q8m" event={"ID":"67b1c08d-f1a5-445a-8bce-8d1de6b37483","Type":"ContainerDied","Data":"93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201"}
Oct 13 18:38:34 crc kubenswrapper[4974]: I1013 18:38:34.895502 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6q8m" event={"ID":"67b1c08d-f1a5-445a-8bce-8d1de6b37483","Type":"ContainerStarted","Data":"68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3"}
Oct 13 18:38:34 crc kubenswrapper[4974]: I1013 18:38:34.923207 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-j6q8m" podStartSLOduration=2.143640915 podStartE2EDuration="5.923176275s" podCreationTimestamp="2025-10-13 18:38:29 +0000 UTC" firstStartedPulling="2025-10-13 18:38:30.821865684 +0000 UTC m=+1445.726231804" lastFinishedPulling="2025-10-13 18:38:34.601401034 +0000 UTC m=+1449.505767164" observedRunningTime="2025-10-13 18:38:34.916841497 +0000 UTC m=+1449.821207597" watchObservedRunningTime="2025-10-13 18:38:34.923176275 +0000 UTC m=+1449.827542395"
Oct 13 18:38:37 crc kubenswrapper[4974]: I1013 18:38:37.743323 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 18:38:37 crc kubenswrapper[4974]: I1013 18:38:37.743828 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 18:38:39 crc kubenswrapper[4974]: I1013 18:38:39.861973 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:39 crc kubenswrapper[4974]: I1013 18:38:39.862384 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:39 crc kubenswrapper[4974]: I1013 18:38:39.937580 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:40 crc kubenswrapper[4974]: I1013 18:38:40.023691 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:40 crc kubenswrapper[4974]: I1013 18:38:40.186383 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6q8m"]
Oct 13 18:38:41 crc kubenswrapper[4974]: I1013 18:38:41.978144 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-j6q8m" podUID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerName="registry-server" containerID="cri-o://68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3" gracePeriod=2
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.452844 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.604967 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-utilities\") pod \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") "
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.605015 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpklf\" (UniqueName: \"kubernetes.io/projected/67b1c08d-f1a5-445a-8bce-8d1de6b37483-kube-api-access-mpklf\") pod \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") "
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.605087 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-catalog-content\") pod \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\" (UID: \"67b1c08d-f1a5-445a-8bce-8d1de6b37483\") "
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.606832 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-utilities" (OuterVolumeSpecName: "utilities") pod "67b1c08d-f1a5-445a-8bce-8d1de6b37483" (UID: "67b1c08d-f1a5-445a-8bce-8d1de6b37483"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.622589 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b1c08d-f1a5-445a-8bce-8d1de6b37483-kube-api-access-mpklf" (OuterVolumeSpecName: "kube-api-access-mpklf") pod "67b1c08d-f1a5-445a-8bce-8d1de6b37483" (UID: "67b1c08d-f1a5-445a-8bce-8d1de6b37483"). InnerVolumeSpecName "kube-api-access-mpklf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.680031 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "67b1c08d-f1a5-445a-8bce-8d1de6b37483" (UID: "67b1c08d-f1a5-445a-8bce-8d1de6b37483"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.708472 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-utilities\") on node \"crc\" DevicePath \"\""
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.708521 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpklf\" (UniqueName: \"kubernetes.io/projected/67b1c08d-f1a5-445a-8bce-8d1de6b37483-kube-api-access-mpklf\") on node \"crc\" DevicePath \"\""
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.708532 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/67b1c08d-f1a5-445a-8bce-8d1de6b37483-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.993377 4974 generic.go:334] "Generic (PLEG): container finished" podID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerID="68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3" exitCode=0
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.993435 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6q8m" event={"ID":"67b1c08d-f1a5-445a-8bce-8d1de6b37483","Type":"ContainerDied","Data":"68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3"}
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.993441 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-j6q8m"
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.993487 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-j6q8m" event={"ID":"67b1c08d-f1a5-445a-8bce-8d1de6b37483","Type":"ContainerDied","Data":"2f2e17cb4fec469b8e6bf2745b53faf6941dac84e75c23f94ee31c9a71fffe3d"}
Oct 13 18:38:42 crc kubenswrapper[4974]: I1013 18:38:42.993521 4974 scope.go:117] "RemoveContainer" containerID="68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3"
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.028941 4974 scope.go:117] "RemoveContainer" containerID="93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201"
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.033912 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-j6q8m"]
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.042835 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-j6q8m"]
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.084730 4974 scope.go:117] "RemoveContainer" containerID="7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51"
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.102812 4974 scope.go:117] "RemoveContainer" containerID="68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3"
Oct 13 18:38:43 crc kubenswrapper[4974]: E1013 18:38:43.103220 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3\": container with ID starting with 68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3 not found: ID does not exist" containerID="68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3"
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.103251 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3"} err="failed to get container status \"68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3\": rpc error: code = NotFound desc = could not find container \"68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3\": container with ID starting with 68db0040589a286634a22365682013be9edabd4aa627396d614350ad8b912cd3 not found: ID does not exist"
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.103274 4974 scope.go:117] "RemoveContainer" containerID="93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201"
Oct 13 18:38:43 crc kubenswrapper[4974]: E1013 18:38:43.103524 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201\": container with ID starting with 93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201 not found: ID does not exist" containerID="93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201"
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.103555 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201"} err="failed to get container status \"93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201\": rpc error: code = NotFound desc = could not find container \"93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201\": container with ID starting with 93852cd38575e4de07942aafb82379fa7950eb1b15a8444e6769d06513fd7201 not found: ID does not exist"
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.103570 4974 scope.go:117] "RemoveContainer" containerID="7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51"
Oct 13 18:38:43 crc kubenswrapper[4974]: E1013 18:38:43.103787 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51\": container with ID starting with 7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51 not found: ID does not exist" containerID="7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51"
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.103811 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51"} err="failed to get container status \"7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51\": rpc error: code = NotFound desc = could not find container \"7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51\": container with ID starting with 7147d751353ea089db75b2a9599c28e2e9499b62155b10dba0c3222b80e28c51 not found: ID does not exist"
Oct 13 18:38:43 crc kubenswrapper[4974]: I1013 18:38:43.824963 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" path="/var/lib/kubelet/pods/67b1c08d-f1a5-445a-8bce-8d1de6b37483/volumes"
Oct 13 18:38:46 crc kubenswrapper[4974]: I1013 18:38:46.660706 4974 scope.go:117] "RemoveContainer" containerID="4add1ee5b173309b2d755f30179584f4be4f02a53c48e99b6c034d6894024840"
Oct 13 18:38:46 crc kubenswrapper[4974]: I1013 18:38:46.705298 4974 scope.go:117] "RemoveContainer" containerID="36fe83fb2a3be1c32d1dbc57d613e58f31a369a15a541a17891b27354a1de5e1"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.129823 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-snjmp"]
Oct 13 18:38:48 crc kubenswrapper[4974]: E1013 18:38:48.130869 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerName="extract-utilities"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.130892 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerName="extract-utilities"
Oct 13 18:38:48 crc kubenswrapper[4974]: E1013 18:38:48.130919 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerName="registry-server"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.130934 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerName="registry-server"
Oct 13 18:38:48 crc kubenswrapper[4974]: E1013 18:38:48.130967 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerName="extract-content"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.130982 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerName="extract-content"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.131350 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="67b1c08d-f1a5-445a-8bce-8d1de6b37483" containerName="registry-server"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.134001 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-snjmp"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.140620 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snjmp"]
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.243186 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-catalog-content\") pod \"redhat-operators-snjmp\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " pod="openshift-marketplace/redhat-operators-snjmp"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.243247 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx5rc\" (UniqueName: \"kubernetes.io/projected/01b45cff-61ac-4f22-ab7f-62a66134e64c-kube-api-access-hx5rc\") pod \"redhat-operators-snjmp\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " pod="openshift-marketplace/redhat-operators-snjmp"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.243906 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-utilities\") pod \"redhat-operators-snjmp\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " pod="openshift-marketplace/redhat-operators-snjmp"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.345286 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx5rc\" (UniqueName: \"kubernetes.io/projected/01b45cff-61ac-4f22-ab7f-62a66134e64c-kube-api-access-hx5rc\") pod \"redhat-operators-snjmp\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " pod="openshift-marketplace/redhat-operators-snjmp"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.345380 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-utilities\") pod \"redhat-operators-snjmp\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " pod="openshift-marketplace/redhat-operators-snjmp"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.345553 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-catalog-content\") pod \"redhat-operators-snjmp\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " pod="openshift-marketplace/redhat-operators-snjmp"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.345971 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-utilities\") pod \"redhat-operators-snjmp\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " pod="openshift-marketplace/redhat-operators-snjmp"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.346056 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-catalog-content\") pod \"redhat-operators-snjmp\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " pod="openshift-marketplace/redhat-operators-snjmp"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.374752 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx5rc\" (UniqueName: \"kubernetes.io/projected/01b45cff-61ac-4f22-ab7f-62a66134e64c-kube-api-access-hx5rc\") pod \"redhat-operators-snjmp\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " pod="openshift-marketplace/redhat-operators-snjmp"
Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.467442 4974 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/redhat-operators-snjmp" Oct 13 18:38:48 crc kubenswrapper[4974]: I1013 18:38:48.925230 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-snjmp"] Oct 13 18:38:49 crc kubenswrapper[4974]: I1013 18:38:49.087087 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snjmp" event={"ID":"01b45cff-61ac-4f22-ab7f-62a66134e64c","Type":"ContainerStarted","Data":"418a8a2195a907eb58ac3ad4d4277c7a9b62a72b634275bfe18027d2b0ac3044"} Oct 13 18:38:50 crc kubenswrapper[4974]: I1013 18:38:50.106287 4974 generic.go:334] "Generic (PLEG): container finished" podID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerID="ba5a392de9767c3f9d79b28721070c37a7f67594571a3f2149130c39a28fd731" exitCode=0 Oct 13 18:38:50 crc kubenswrapper[4974]: I1013 18:38:50.106427 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snjmp" event={"ID":"01b45cff-61ac-4f22-ab7f-62a66134e64c","Type":"ContainerDied","Data":"ba5a392de9767c3f9d79b28721070c37a7f67594571a3f2149130c39a28fd731"} Oct 13 18:38:50 crc kubenswrapper[4974]: I1013 18:38:50.110284 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 18:38:52 crc kubenswrapper[4974]: I1013 18:38:52.137760 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snjmp" event={"ID":"01b45cff-61ac-4f22-ab7f-62a66134e64c","Type":"ContainerStarted","Data":"fed278ac8bcb7a500f9795871532d1615973702339828733efc5e42ad86014c0"} Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.518357 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lj477"] Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.520484 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.547738 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lj477"] Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.704185 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-utilities\") pod \"certified-operators-lj477\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.705291 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmsxv\" (UniqueName: \"kubernetes.io/projected/fc712ccb-55e2-448d-b630-f7e8ad267db2-kube-api-access-nmsxv\") pod \"certified-operators-lj477\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.705425 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-catalog-content\") pod \"certified-operators-lj477\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.807341 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-catalog-content\") pod \"certified-operators-lj477\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.807421 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-utilities\") pod \"certified-operators-lj477\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.807544 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmsxv\" (UniqueName: \"kubernetes.io/projected/fc712ccb-55e2-448d-b630-f7e8ad267db2-kube-api-access-nmsxv\") pod \"certified-operators-lj477\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.808244 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-catalog-content\") pod \"certified-operators-lj477\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.808454 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-utilities\") pod \"certified-operators-lj477\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.831467 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmsxv\" (UniqueName: \"kubernetes.io/projected/fc712ccb-55e2-448d-b630-f7e8ad267db2-kube-api-access-nmsxv\") pod \"certified-operators-lj477\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:53 crc kubenswrapper[4974]: I1013 18:38:53.843977 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:38:54 crc kubenswrapper[4974]: I1013 18:38:54.338260 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lj477"] Oct 13 18:38:54 crc kubenswrapper[4974]: W1013 18:38:54.358367 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc712ccb_55e2_448d_b630_f7e8ad267db2.slice/crio-d6ed1a16f172e9dfaccbdb61f35870272063926efe8a33a0b9ac73e5fba210c3 WatchSource:0}: Error finding container d6ed1a16f172e9dfaccbdb61f35870272063926efe8a33a0b9ac73e5fba210c3: Status 404 returned error can't find the container with id d6ed1a16f172e9dfaccbdb61f35870272063926efe8a33a0b9ac73e5fba210c3 Oct 13 18:38:55 crc kubenswrapper[4974]: I1013 18:38:55.176074 4974 generic.go:334] "Generic (PLEG): container finished" podID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerID="fed278ac8bcb7a500f9795871532d1615973702339828733efc5e42ad86014c0" exitCode=0 Oct 13 18:38:55 crc kubenswrapper[4974]: I1013 18:38:55.176159 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snjmp" event={"ID":"01b45cff-61ac-4f22-ab7f-62a66134e64c","Type":"ContainerDied","Data":"fed278ac8bcb7a500f9795871532d1615973702339828733efc5e42ad86014c0"} Oct 13 18:38:55 crc kubenswrapper[4974]: I1013 18:38:55.179926 4974 generic.go:334] "Generic (PLEG): container finished" podID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerID="57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a" exitCode=0 Oct 13 18:38:55 crc kubenswrapper[4974]: I1013 18:38:55.179984 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj477" event={"ID":"fc712ccb-55e2-448d-b630-f7e8ad267db2","Type":"ContainerDied","Data":"57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a"} Oct 13 18:38:55 crc kubenswrapper[4974]: I1013 
18:38:55.180027 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj477" event={"ID":"fc712ccb-55e2-448d-b630-f7e8ad267db2","Type":"ContainerStarted","Data":"d6ed1a16f172e9dfaccbdb61f35870272063926efe8a33a0b9ac73e5fba210c3"} Oct 13 18:38:56 crc kubenswrapper[4974]: I1013 18:38:56.190543 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snjmp" event={"ID":"01b45cff-61ac-4f22-ab7f-62a66134e64c","Type":"ContainerStarted","Data":"4bc8fe5d9e170ac20a24be2c835c20bda6685f8b951b55e0dc882a620e853ec0"} Oct 13 18:38:56 crc kubenswrapper[4974]: I1013 18:38:56.210271 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-snjmp" podStartSLOduration=2.666596801 podStartE2EDuration="8.210254929s" podCreationTimestamp="2025-10-13 18:38:48 +0000 UTC" firstStartedPulling="2025-10-13 18:38:50.109692704 +0000 UTC m=+1465.014058824" lastFinishedPulling="2025-10-13 18:38:55.653350872 +0000 UTC m=+1470.557716952" observedRunningTime="2025-10-13 18:38:56.20497228 +0000 UTC m=+1471.109338370" watchObservedRunningTime="2025-10-13 18:38:56.210254929 +0000 UTC m=+1471.114621009" Oct 13 18:38:57 crc kubenswrapper[4974]: I1013 18:38:57.209437 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj477" event={"ID":"fc712ccb-55e2-448d-b630-f7e8ad267db2","Type":"ContainerStarted","Data":"6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c"} Oct 13 18:38:58 crc kubenswrapper[4974]: I1013 18:38:58.223967 4974 generic.go:334] "Generic (PLEG): container finished" podID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerID="6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c" exitCode=0 Oct 13 18:38:58 crc kubenswrapper[4974]: I1013 18:38:58.224940 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj477" 
event={"ID":"fc712ccb-55e2-448d-b630-f7e8ad267db2","Type":"ContainerDied","Data":"6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c"} Oct 13 18:38:58 crc kubenswrapper[4974]: I1013 18:38:58.467696 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-snjmp" Oct 13 18:38:58 crc kubenswrapper[4974]: I1013 18:38:58.468365 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-snjmp" Oct 13 18:38:59 crc kubenswrapper[4974]: I1013 18:38:59.238246 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj477" event={"ID":"fc712ccb-55e2-448d-b630-f7e8ad267db2","Type":"ContainerStarted","Data":"bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d"} Oct 13 18:38:59 crc kubenswrapper[4974]: I1013 18:38:59.271893 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lj477" podStartSLOduration=2.780469983 podStartE2EDuration="6.271868609s" podCreationTimestamp="2025-10-13 18:38:53 +0000 UTC" firstStartedPulling="2025-10-13 18:38:55.181923917 +0000 UTC m=+1470.086290007" lastFinishedPulling="2025-10-13 18:38:58.673322513 +0000 UTC m=+1473.577688633" observedRunningTime="2025-10-13 18:38:59.261120137 +0000 UTC m=+1474.165486227" watchObservedRunningTime="2025-10-13 18:38:59.271868609 +0000 UTC m=+1474.176234709" Oct 13 18:38:59 crc kubenswrapper[4974]: I1013 18:38:59.537815 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-snjmp" podUID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerName="registry-server" probeResult="failure" output=< Oct 13 18:38:59 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 18:38:59 crc kubenswrapper[4974]: > Oct 13 18:39:03 crc kubenswrapper[4974]: I1013 18:39:03.845196 4974 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:39:03 crc kubenswrapper[4974]: I1013 18:39:03.845615 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:39:03 crc kubenswrapper[4974]: I1013 18:39:03.911063 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:39:04 crc kubenswrapper[4974]: I1013 18:39:04.352132 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:39:04 crc kubenswrapper[4974]: I1013 18:39:04.412054 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lj477"] Oct 13 18:39:06 crc kubenswrapper[4974]: I1013 18:39:06.318483 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lj477" podUID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerName="registry-server" containerID="cri-o://bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d" gracePeriod=2 Oct 13 18:39:06 crc kubenswrapper[4974]: I1013 18:39:06.860831 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:39:06 crc kubenswrapper[4974]: I1013 18:39:06.974692 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-utilities\") pod \"fc712ccb-55e2-448d-b630-f7e8ad267db2\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " Oct 13 18:39:06 crc kubenswrapper[4974]: I1013 18:39:06.974786 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmsxv\" (UniqueName: \"kubernetes.io/projected/fc712ccb-55e2-448d-b630-f7e8ad267db2-kube-api-access-nmsxv\") pod \"fc712ccb-55e2-448d-b630-f7e8ad267db2\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " Oct 13 18:39:06 crc kubenswrapper[4974]: I1013 18:39:06.974914 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-catalog-content\") pod \"fc712ccb-55e2-448d-b630-f7e8ad267db2\" (UID: \"fc712ccb-55e2-448d-b630-f7e8ad267db2\") " Oct 13 18:39:06 crc kubenswrapper[4974]: I1013 18:39:06.975469 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-utilities" (OuterVolumeSpecName: "utilities") pod "fc712ccb-55e2-448d-b630-f7e8ad267db2" (UID: "fc712ccb-55e2-448d-b630-f7e8ad267db2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:39:06 crc kubenswrapper[4974]: I1013 18:39:06.987901 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc712ccb-55e2-448d-b630-f7e8ad267db2-kube-api-access-nmsxv" (OuterVolumeSpecName: "kube-api-access-nmsxv") pod "fc712ccb-55e2-448d-b630-f7e8ad267db2" (UID: "fc712ccb-55e2-448d-b630-f7e8ad267db2"). InnerVolumeSpecName "kube-api-access-nmsxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.019710 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc712ccb-55e2-448d-b630-f7e8ad267db2" (UID: "fc712ccb-55e2-448d-b630-f7e8ad267db2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.077320 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.077350 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmsxv\" (UniqueName: \"kubernetes.io/projected/fc712ccb-55e2-448d-b630-f7e8ad267db2-kube-api-access-nmsxv\") on node \"crc\" DevicePath \"\"" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.077361 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc712ccb-55e2-448d-b630-f7e8ad267db2-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.328854 4974 generic.go:334] "Generic (PLEG): container finished" podID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerID="bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d" exitCode=0 Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.328891 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lj477" event={"ID":"fc712ccb-55e2-448d-b630-f7e8ad267db2","Type":"ContainerDied","Data":"bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d"} Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.328923 4974 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-lj477" event={"ID":"fc712ccb-55e2-448d-b630-f7e8ad267db2","Type":"ContainerDied","Data":"d6ed1a16f172e9dfaccbdb61f35870272063926efe8a33a0b9ac73e5fba210c3"} Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.328940 4974 scope.go:117] "RemoveContainer" containerID="bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.329821 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lj477" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.375205 4974 scope.go:117] "RemoveContainer" containerID="6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.394126 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lj477"] Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.404582 4974 scope.go:117] "RemoveContainer" containerID="57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.416308 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lj477"] Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.456935 4974 scope.go:117] "RemoveContainer" containerID="bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d" Oct 13 18:39:07 crc kubenswrapper[4974]: E1013 18:39:07.457524 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d\": container with ID starting with bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d not found: ID does not exist" containerID="bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 
18:39:07.457573 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d"} err="failed to get container status \"bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d\": rpc error: code = NotFound desc = could not find container \"bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d\": container with ID starting with bbbff734368400bec23eba1f09ecf6062b542070c370454fb54daec7e2b8c56d not found: ID does not exist" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.457607 4974 scope.go:117] "RemoveContainer" containerID="6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c" Oct 13 18:39:07 crc kubenswrapper[4974]: E1013 18:39:07.458027 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c\": container with ID starting with 6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c not found: ID does not exist" containerID="6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.458059 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c"} err="failed to get container status \"6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c\": rpc error: code = NotFound desc = could not find container \"6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c\": container with ID starting with 6cbb66b5e045545b30a5e4cc2e00ae91a213d2bb421ad5288242009c5bbb8f0c not found: ID does not exist" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.458077 4974 scope.go:117] "RemoveContainer" containerID="57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a" Oct 13 18:39:07 crc 
kubenswrapper[4974]: E1013 18:39:07.458409 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a\": container with ID starting with 57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a not found: ID does not exist" containerID="57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.458436 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a"} err="failed to get container status \"57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a\": rpc error: code = NotFound desc = could not find container \"57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a\": container with ID starting with 57d4f1007b01b759068137f0087c1aa005089773bf6b50edcc8abcb9f2a3289a not found: ID does not exist" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.743726 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.743821 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.743896 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 
18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.745189 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ea39e73c9d324740300df4165ffa4e49cea26ef15903575dd3e89904e7bd3e53"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.745307 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://ea39e73c9d324740300df4165ffa4e49cea26ef15903575dd3e89904e7bd3e53" gracePeriod=600 Oct 13 18:39:07 crc kubenswrapper[4974]: I1013 18:39:07.827261 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc712ccb-55e2-448d-b630-f7e8ad267db2" path="/var/lib/kubelet/pods/fc712ccb-55e2-448d-b630-f7e8ad267db2/volumes" Oct 13 18:39:08 crc kubenswrapper[4974]: I1013 18:39:08.345380 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="ea39e73c9d324740300df4165ffa4e49cea26ef15903575dd3e89904e7bd3e53" exitCode=0 Oct 13 18:39:08 crc kubenswrapper[4974]: I1013 18:39:08.345511 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"ea39e73c9d324740300df4165ffa4e49cea26ef15903575dd3e89904e7bd3e53"} Oct 13 18:39:08 crc kubenswrapper[4974]: I1013 18:39:08.345813 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2"} 
Oct 13 18:39:08 crc kubenswrapper[4974]: I1013 18:39:08.345842 4974 scope.go:117] "RemoveContainer" containerID="5a8943cc34f896ee3b60b9837ff3add67567a5a52b9b5a1adbb600a9ed07e274" Oct 13 18:39:08 crc kubenswrapper[4974]: I1013 18:39:08.531640 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-snjmp" Oct 13 18:39:08 crc kubenswrapper[4974]: I1013 18:39:08.605660 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-snjmp" Oct 13 18:39:09 crc kubenswrapper[4974]: I1013 18:39:09.561906 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snjmp"] Oct 13 18:39:10 crc kubenswrapper[4974]: I1013 18:39:10.385419 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-snjmp" podUID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerName="registry-server" containerID="cri-o://4bc8fe5d9e170ac20a24be2c835c20bda6685f8b951b55e0dc882a620e853ec0" gracePeriod=2 Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.428475 4974 generic.go:334] "Generic (PLEG): container finished" podID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerID="4bc8fe5d9e170ac20a24be2c835c20bda6685f8b951b55e0dc882a620e853ec0" exitCode=0 Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.428921 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snjmp" event={"ID":"01b45cff-61ac-4f22-ab7f-62a66134e64c","Type":"ContainerDied","Data":"4bc8fe5d9e170ac20a24be2c835c20bda6685f8b951b55e0dc882a620e853ec0"} Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.513295 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snjmp" Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.582443 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx5rc\" (UniqueName: \"kubernetes.io/projected/01b45cff-61ac-4f22-ab7f-62a66134e64c-kube-api-access-hx5rc\") pod \"01b45cff-61ac-4f22-ab7f-62a66134e64c\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.582650 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-catalog-content\") pod \"01b45cff-61ac-4f22-ab7f-62a66134e64c\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.582751 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-utilities\") pod \"01b45cff-61ac-4f22-ab7f-62a66134e64c\" (UID: \"01b45cff-61ac-4f22-ab7f-62a66134e64c\") " Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.583366 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-utilities" (OuterVolumeSpecName: "utilities") pod "01b45cff-61ac-4f22-ab7f-62a66134e64c" (UID: "01b45cff-61ac-4f22-ab7f-62a66134e64c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.588016 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01b45cff-61ac-4f22-ab7f-62a66134e64c-kube-api-access-hx5rc" (OuterVolumeSpecName: "kube-api-access-hx5rc") pod "01b45cff-61ac-4f22-ab7f-62a66134e64c" (UID: "01b45cff-61ac-4f22-ab7f-62a66134e64c"). InnerVolumeSpecName "kube-api-access-hx5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.665180 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01b45cff-61ac-4f22-ab7f-62a66134e64c" (UID: "01b45cff-61ac-4f22-ab7f-62a66134e64c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.685580 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx5rc\" (UniqueName: \"kubernetes.io/projected/01b45cff-61ac-4f22-ab7f-62a66134e64c-kube-api-access-hx5rc\") on node \"crc\" DevicePath \"\"" Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.685852 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:39:11 crc kubenswrapper[4974]: I1013 18:39:11.685978 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01b45cff-61ac-4f22-ab7f-62a66134e64c-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:39:12 crc kubenswrapper[4974]: I1013 18:39:12.445113 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-snjmp" event={"ID":"01b45cff-61ac-4f22-ab7f-62a66134e64c","Type":"ContainerDied","Data":"418a8a2195a907eb58ac3ad4d4277c7a9b62a72b634275bfe18027d2b0ac3044"} Oct 13 18:39:12 crc kubenswrapper[4974]: I1013 18:39:12.445190 4974 scope.go:117] "RemoveContainer" containerID="4bc8fe5d9e170ac20a24be2c835c20bda6685f8b951b55e0dc882a620e853ec0" Oct 13 18:39:12 crc kubenswrapper[4974]: I1013 18:39:12.445210 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-snjmp" Oct 13 18:39:12 crc kubenswrapper[4974]: I1013 18:39:12.469933 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-snjmp"] Oct 13 18:39:12 crc kubenswrapper[4974]: I1013 18:39:12.476942 4974 scope.go:117] "RemoveContainer" containerID="fed278ac8bcb7a500f9795871532d1615973702339828733efc5e42ad86014c0" Oct 13 18:39:12 crc kubenswrapper[4974]: I1013 18:39:12.481813 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-snjmp"] Oct 13 18:39:12 crc kubenswrapper[4974]: I1013 18:39:12.507488 4974 scope.go:117] "RemoveContainer" containerID="ba5a392de9767c3f9d79b28721070c37a7f67594571a3f2149130c39a28fd731" Oct 13 18:39:13 crc kubenswrapper[4974]: I1013 18:39:13.822616 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01b45cff-61ac-4f22-ab7f-62a66134e64c" path="/var/lib/kubelet/pods/01b45cff-61ac-4f22-ab7f-62a66134e64c/volumes" Oct 13 18:39:46 crc kubenswrapper[4974]: I1013 18:39:46.843289 4974 scope.go:117] "RemoveContainer" containerID="96cd22203fba9524159bc61ad032f605190b0b5f60e6571e95d8d40617d712f5" Oct 13 18:39:46 crc kubenswrapper[4974]: I1013 18:39:46.875790 4974 scope.go:117] "RemoveContainer" containerID="1241cb87cad276d93066d5ab9659eb3ac6b16d2c9d8fc4680822ca212cacbc47" Oct 13 18:39:54 crc kubenswrapper[4974]: I1013 18:39:54.991337 4974 generic.go:334] "Generic (PLEG): container finished" podID="684a8cdf-df17-41a7-87b8-9027cb982025" containerID="43aa7a036a9dff60ec4567895a41b1d2fc39d5f5f11ce21efa621057fbba8b85" exitCode=0 Oct 13 18:39:54 crc kubenswrapper[4974]: I1013 18:39:54.991450 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg" event={"ID":"684a8cdf-df17-41a7-87b8-9027cb982025","Type":"ContainerDied","Data":"43aa7a036a9dff60ec4567895a41b1d2fc39d5f5f11ce21efa621057fbba8b85"} Oct 13 18:39:56 crc 
kubenswrapper[4974]: I1013 18:39:56.493016 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg" Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.581701 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-ssh-key\") pod \"684a8cdf-df17-41a7-87b8-9027cb982025\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.581765 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-inventory\") pod \"684a8cdf-df17-41a7-87b8-9027cb982025\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.581841 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-bootstrap-combined-ca-bundle\") pod \"684a8cdf-df17-41a7-87b8-9027cb982025\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.581899 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxx4d\" (UniqueName: \"kubernetes.io/projected/684a8cdf-df17-41a7-87b8-9027cb982025-kube-api-access-qxx4d\") pod \"684a8cdf-df17-41a7-87b8-9027cb982025\" (UID: \"684a8cdf-df17-41a7-87b8-9027cb982025\") " Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.587425 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/684a8cdf-df17-41a7-87b8-9027cb982025-kube-api-access-qxx4d" (OuterVolumeSpecName: "kube-api-access-qxx4d") pod "684a8cdf-df17-41a7-87b8-9027cb982025" (UID: "684a8cdf-df17-41a7-87b8-9027cb982025"). 
InnerVolumeSpecName "kube-api-access-qxx4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.587795 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "684a8cdf-df17-41a7-87b8-9027cb982025" (UID: "684a8cdf-df17-41a7-87b8-9027cb982025"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.610235 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "684a8cdf-df17-41a7-87b8-9027cb982025" (UID: "684a8cdf-df17-41a7-87b8-9027cb982025"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.621699 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-inventory" (OuterVolumeSpecName: "inventory") pod "684a8cdf-df17-41a7-87b8-9027cb982025" (UID: "684a8cdf-df17-41a7-87b8-9027cb982025"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.683800 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.683854 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.683873 4974 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/684a8cdf-df17-41a7-87b8-9027cb982025-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:39:56 crc kubenswrapper[4974]: I1013 18:39:56.683892 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxx4d\" (UniqueName: \"kubernetes.io/projected/684a8cdf-df17-41a7-87b8-9027cb982025-kube-api-access-qxx4d\") on node \"crc\" DevicePath \"\"" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.017585 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg" event={"ID":"684a8cdf-df17-41a7-87b8-9027cb982025","Type":"ContainerDied","Data":"cfc0f62d2be9821c176ab9b13f51433283d6f59c30531bf0a4589c4a86a45d23"} Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.017993 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfc0f62d2be9821c176ab9b13f51433283d6f59c30531bf0a4589c4a86a45d23" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.017639 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.114300 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms"] Oct 13 18:39:57 crc kubenswrapper[4974]: E1013 18:39:57.114883 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerName="registry-server" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.114935 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerName="registry-server" Oct 13 18:39:57 crc kubenswrapper[4974]: E1013 18:39:57.114962 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerName="registry-server" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.114973 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerName="registry-server" Oct 13 18:39:57 crc kubenswrapper[4974]: E1013 18:39:57.114999 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="684a8cdf-df17-41a7-87b8-9027cb982025" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.115013 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="684a8cdf-df17-41a7-87b8-9027cb982025" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 13 18:39:57 crc kubenswrapper[4974]: E1013 18:39:57.115037 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerName="extract-utilities" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.115048 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerName="extract-utilities" Oct 13 18:39:57 crc kubenswrapper[4974]: E1013 18:39:57.115070 
4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerName="extract-content" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.115081 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerName="extract-content" Oct 13 18:39:57 crc kubenswrapper[4974]: E1013 18:39:57.115109 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerName="extract-content" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.115120 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerName="extract-content" Oct 13 18:39:57 crc kubenswrapper[4974]: E1013 18:39:57.115151 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerName="extract-utilities" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.115161 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerName="extract-utilities" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.115500 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc712ccb-55e2-448d-b630-f7e8ad267db2" containerName="registry-server" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.115532 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="684a8cdf-df17-41a7-87b8-9027cb982025" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.115575 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="01b45cff-61ac-4f22-ab7f-62a66134e64c" containerName="registry-server" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.116718 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.123284 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.123461 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.123540 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.123771 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.129442 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms"] Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.192177 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8lbms\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.192284 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8lbms\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.192313 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgfqv\" (UniqueName: \"kubernetes.io/projected/dec36c0a-5335-4f2c-9582-ccd2c8f30207-kube-api-access-fgfqv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8lbms\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.294201 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8lbms\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.294265 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgfqv\" (UniqueName: \"kubernetes.io/projected/dec36c0a-5335-4f2c-9582-ccd2c8f30207-kube-api-access-fgfqv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8lbms\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.294388 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8lbms\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.300063 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-inventory\") pod 
\"download-cache-edpm-deployment-openstack-edpm-ipam-8lbms\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.300187 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8lbms\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.323965 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgfqv\" (UniqueName: \"kubernetes.io/projected/dec36c0a-5335-4f2c-9582-ccd2c8f30207-kube-api-access-fgfqv\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-8lbms\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.436840 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:39:57 crc kubenswrapper[4974]: I1013 18:39:57.780679 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms"] Oct 13 18:39:58 crc kubenswrapper[4974]: I1013 18:39:58.037207 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" event={"ID":"dec36c0a-5335-4f2c-9582-ccd2c8f30207","Type":"ContainerStarted","Data":"31d2772dd22bfd09499c26549b57674036c1f4b00c03a3b879d982cf4a0205fc"} Oct 13 18:39:59 crc kubenswrapper[4974]: I1013 18:39:59.053251 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" event={"ID":"dec36c0a-5335-4f2c-9582-ccd2c8f30207","Type":"ContainerStarted","Data":"8a1d8b4c066890648e1c1ffecc875cb60cab0205c66e4b7432655d90d8603ee2"} Oct 13 18:39:59 crc kubenswrapper[4974]: I1013 18:39:59.072839 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" podStartSLOduration=1.409908309 podStartE2EDuration="2.072816707s" podCreationTimestamp="2025-10-13 18:39:57 +0000 UTC" firstStartedPulling="2025-10-13 18:39:57.785534359 +0000 UTC m=+1532.689900439" lastFinishedPulling="2025-10-13 18:39:58.448442717 +0000 UTC m=+1533.352808837" observedRunningTime="2025-10-13 18:39:59.067840436 +0000 UTC m=+1533.972206526" watchObservedRunningTime="2025-10-13 18:39:59.072816707 +0000 UTC m=+1533.977182807" Oct 13 18:40:07 crc kubenswrapper[4974]: I1013 18:40:07.046013 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-mb5k8"] Oct 13 18:40:07 crc kubenswrapper[4974]: I1013 18:40:07.056464 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jmszb"] Oct 13 18:40:07 crc kubenswrapper[4974]: I1013 
18:40:07.066491 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jmszb"] Oct 13 18:40:07 crc kubenswrapper[4974]: I1013 18:40:07.075252 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-mb5k8"] Oct 13 18:40:07 crc kubenswrapper[4974]: I1013 18:40:07.833975 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4ff600a-6c11-4019-93a5-efe02ce706ab" path="/var/lib/kubelet/pods/c4ff600a-6c11-4019-93a5-efe02ce706ab/volumes" Oct 13 18:40:07 crc kubenswrapper[4974]: I1013 18:40:07.835317 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="feef59e0-044c-4c9b-b496-44c30b067129" path="/var/lib/kubelet/pods/feef59e0-044c-4c9b-b496-44c30b067129/volumes" Oct 13 18:40:10 crc kubenswrapper[4974]: I1013 18:40:10.034099 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-5z6xn"] Oct 13 18:40:10 crc kubenswrapper[4974]: I1013 18:40:10.048627 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-5z6xn"] Oct 13 18:40:11 crc kubenswrapper[4974]: I1013 18:40:11.838618 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="534daef5-5335-4b53-ba13-980e53570cd7" path="/var/lib/kubelet/pods/534daef5-5335-4b53-ba13-980e53570cd7/volumes" Oct 13 18:40:21 crc kubenswrapper[4974]: I1013 18:40:21.075701 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-510b-account-create-dtxwp"] Oct 13 18:40:21 crc kubenswrapper[4974]: I1013 18:40:21.091915 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-c003-account-create-242fn"] Oct 13 18:40:21 crc kubenswrapper[4974]: I1013 18:40:21.102632 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a9aa-account-create-zkllw"] Oct 13 18:40:21 crc kubenswrapper[4974]: I1013 18:40:21.111896 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/watcher-510b-account-create-dtxwp"] Oct 13 18:40:21 crc kubenswrapper[4974]: I1013 18:40:21.118558 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a9aa-account-create-zkllw"] Oct 13 18:40:21 crc kubenswrapper[4974]: I1013 18:40:21.126284 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-c003-account-create-242fn"] Oct 13 18:40:21 crc kubenswrapper[4974]: I1013 18:40:21.861052 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="756682a7-9eac-443f-a6a0-2d979d1268d3" path="/var/lib/kubelet/pods/756682a7-9eac-443f-a6a0-2d979d1268d3/volumes" Oct 13 18:40:21 crc kubenswrapper[4974]: I1013 18:40:21.862277 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76e96c17-0dd3-4925-b4f5-cf2ee96b2868" path="/var/lib/kubelet/pods/76e96c17-0dd3-4925-b4f5-cf2ee96b2868/volumes" Oct 13 18:40:21 crc kubenswrapper[4974]: I1013 18:40:21.864389 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="892baf7e-37ac-4364-9122-418e1997266d" path="/var/lib/kubelet/pods/892baf7e-37ac-4364-9122-418e1997266d/volumes" Oct 13 18:40:45 crc kubenswrapper[4974]: I1013 18:40:45.056417 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-7mkgf"] Oct 13 18:40:45 crc kubenswrapper[4974]: I1013 18:40:45.068612 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-7mkgf"] Oct 13 18:40:45 crc kubenswrapper[4974]: I1013 18:40:45.825485 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b802023-9db0-4145-8872-6fb2fb3f26e3" path="/var/lib/kubelet/pods/7b802023-9db0-4145-8872-6fb2fb3f26e3/volumes" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.016718 4974 scope.go:117] "RemoveContainer" containerID="c52aabfef332cc4dda8b2cfdd1c2e5a3f1f3030ff41a4fe490ffb8cd9fdc98f4" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.064817 4974 scope.go:117] "RemoveContainer" 
containerID="b6a59ad96a06bff61b6c41d0f84c5b47a2ef3391d0dc087811c2809431789612" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.085447 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-pwgg6"] Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.097496 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-zqz9v"] Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.098156 4974 scope.go:117] "RemoveContainer" containerID="bb3167f020d19fea7d8b23c25b06afa20cc989e85a9cfa01936ee994b7a2a677" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.105372 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-r947d"] Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.114590 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-pwgg6"] Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.121183 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-r947d"] Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.127827 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-zqz9v"] Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.172851 4974 scope.go:117] "RemoveContainer" containerID="3e2285772d7905b9bda0126dc94d24d559183db00d7b16cfb8c16c8db10d82a4" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.222494 4974 scope.go:117] "RemoveContainer" containerID="29f125fc4dce659e6f293987d08ade7a2608d844f008f3a491681cb5103d0077" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.251556 4974 scope.go:117] "RemoveContainer" containerID="a23f544c94efd54b1575365a3a1b38c4f701ae96577e618eaf603f909d5a4a23" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.282351 4974 scope.go:117] "RemoveContainer" containerID="016d41110f2a4b9c980c6704250037c47564e05b346e841c0eb611c384e8cac5" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.343020 4974 
scope.go:117] "RemoveContainer" containerID="73f9fdfd8f691ddd6d02816c89d10cb653aa773d81715f9cf6e195f5a320f77f" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.390915 4974 scope.go:117] "RemoveContainer" containerID="1a36eeda5829317cf41e2a570a027aa0cd0a3ed05abb0a975bead2c2ac42018e" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.433198 4974 scope.go:117] "RemoveContainer" containerID="48c02468a7a7d14686cf919f3fcb7c891bd4bf746c5701ec1d940a02a1527db1" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.476409 4974 scope.go:117] "RemoveContainer" containerID="30c4533ea9b2d1d827cad3870c8a2364fcba5120f0d88a2158276d93e8e5e5d4" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.503906 4974 scope.go:117] "RemoveContainer" containerID="aed60a5c13f637f89fa1e08e9857003f10646a88f4d3cd9ff45783c355390c62" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.830450 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="723dbb39-0ec4-4193-8fb9-307d311f962e" path="/var/lib/kubelet/pods/723dbb39-0ec4-4193-8fb9-307d311f962e/volumes" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.831861 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="839f3919-be07-4c53-9d6d-92e2ad6d0059" path="/var/lib/kubelet/pods/839f3919-be07-4c53-9d6d-92e2ad6d0059/volumes" Oct 13 18:40:47 crc kubenswrapper[4974]: I1013 18:40:47.832976 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff4b736-2214-4791-b0c5-6d909c396d53" path="/var/lib/kubelet/pods/fff4b736-2214-4791-b0c5-6d909c396d53/volumes" Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.044814 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-27f6-account-create-42tjc"] Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.062603 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-cb6f-account-create-2qvkf"] Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.075207 4974 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-d9de-account-create-lw89h"] Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.088512 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-ccf2-account-create-5h2vg"] Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.098644 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-cb6f-account-create-2qvkf"] Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.105514 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-27f6-account-create-42tjc"] Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.112445 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-d9de-account-create-lw89h"] Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.119321 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-ccf2-account-create-5h2vg"] Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.830929 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b" path="/var/lib/kubelet/pods/0f6785f8-7dd2-4a77-8e6c-437cb7b2dd6b/volumes" Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.831415 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c510728-c859-42b5-ac31-a8c1c9b10b75" path="/var/lib/kubelet/pods/1c510728-c859-42b5-ac31-a8c1c9b10b75/volumes" Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.831877 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f298626-ae2a-4e3d-8c73-f1813c3aa22d" path="/var/lib/kubelet/pods/4f298626-ae2a-4e3d-8c73-f1813c3aa22d/volumes" Oct 13 18:40:59 crc kubenswrapper[4974]: I1013 18:40:59.832354 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a93751bb-94e4-430a-8e5c-2c5dd63bb013" path="/var/lib/kubelet/pods/a93751bb-94e4-430a-8e5c-2c5dd63bb013/volumes" Oct 13 18:41:00 crc kubenswrapper[4974]: I1013 
18:41:00.074364 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-f52xm"] Oct 13 18:41:00 crc kubenswrapper[4974]: I1013 18:41:00.089666 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-f52xm"] Oct 13 18:41:01 crc kubenswrapper[4974]: I1013 18:41:01.044934 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-p8g7f"] Oct 13 18:41:01 crc kubenswrapper[4974]: I1013 18:41:01.058544 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-p8g7f"] Oct 13 18:41:01 crc kubenswrapper[4974]: I1013 18:41:01.837918 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f53b24c-9498-4e81-815b-9817e4be03be" path="/var/lib/kubelet/pods/4f53b24c-9498-4e81-815b-9817e4be03be/volumes" Oct 13 18:41:01 crc kubenswrapper[4974]: I1013 18:41:01.841348 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84eeba03-fc43-4027-8528-dc161a147dfb" path="/var/lib/kubelet/pods/84eeba03-fc43-4027-8528-dc161a147dfb/volumes" Oct 13 18:41:37 crc kubenswrapper[4974]: I1013 18:41:37.743834 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:41:37 crc kubenswrapper[4974]: I1013 18:41:37.744630 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:41:38 crc kubenswrapper[4974]: I1013 18:41:38.341160 4974 generic.go:334] "Generic (PLEG): container finished" podID="dec36c0a-5335-4f2c-9582-ccd2c8f30207" 
containerID="8a1d8b4c066890648e1c1ffecc875cb60cab0205c66e4b7432655d90d8603ee2" exitCode=0 Oct 13 18:41:38 crc kubenswrapper[4974]: I1013 18:41:38.341219 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" event={"ID":"dec36c0a-5335-4f2c-9582-ccd2c8f30207","Type":"ContainerDied","Data":"8a1d8b4c066890648e1c1ffecc875cb60cab0205c66e4b7432655d90d8603ee2"} Oct 13 18:41:39 crc kubenswrapper[4974]: I1013 18:41:39.831353 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.005013 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgfqv\" (UniqueName: \"kubernetes.io/projected/dec36c0a-5335-4f2c-9582-ccd2c8f30207-kube-api-access-fgfqv\") pod \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.005097 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-ssh-key\") pod \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.005365 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-inventory\") pod \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\" (UID: \"dec36c0a-5335-4f2c-9582-ccd2c8f30207\") " Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.010949 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec36c0a-5335-4f2c-9582-ccd2c8f30207-kube-api-access-fgfqv" (OuterVolumeSpecName: "kube-api-access-fgfqv") pod 
"dec36c0a-5335-4f2c-9582-ccd2c8f30207" (UID: "dec36c0a-5335-4f2c-9582-ccd2c8f30207"). InnerVolumeSpecName "kube-api-access-fgfqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.043967 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dec36c0a-5335-4f2c-9582-ccd2c8f30207" (UID: "dec36c0a-5335-4f2c-9582-ccd2c8f30207"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.046471 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-inventory" (OuterVolumeSpecName: "inventory") pod "dec36c0a-5335-4f2c-9582-ccd2c8f30207" (UID: "dec36c0a-5335-4f2c-9582-ccd2c8f30207"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.107756 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.107833 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dec36c0a-5335-4f2c-9582-ccd2c8f30207-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.107853 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgfqv\" (UniqueName: \"kubernetes.io/projected/dec36c0a-5335-4f2c-9582-ccd2c8f30207-kube-api-access-fgfqv\") on node \"crc\" DevicePath \"\"" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.371153 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" event={"ID":"dec36c0a-5335-4f2c-9582-ccd2c8f30207","Type":"ContainerDied","Data":"31d2772dd22bfd09499c26549b57674036c1f4b00c03a3b879d982cf4a0205fc"} Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.371211 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-8lbms" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.371224 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31d2772dd22bfd09499c26549b57674036c1f4b00c03a3b879d982cf4a0205fc" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.527107 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9"] Oct 13 18:41:40 crc kubenswrapper[4974]: E1013 18:41:40.527921 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec36c0a-5335-4f2c-9582-ccd2c8f30207" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.527974 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec36c0a-5335-4f2c-9582-ccd2c8f30207" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.528476 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec36c0a-5335-4f2c-9582-ccd2c8f30207" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.529829 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.534946 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.535600 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.535628 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.536565 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.542858 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9"] Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.617582 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r86ql\" (UniqueName: \"kubernetes.io/projected/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-kube-api-access-r86ql\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.618164 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:40 crc kubenswrapper[4974]: 
I1013 18:41:40.618283 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.719229 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r86ql\" (UniqueName: \"kubernetes.io/projected/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-kube-api-access-r86ql\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.719401 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.719457 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.726859 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-ssh-key\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.729843 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.749305 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r86ql\" (UniqueName: \"kubernetes.io/projected/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-kube-api-access-r86ql\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:40 crc kubenswrapper[4974]: I1013 18:41:40.853985 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:41:41 crc kubenswrapper[4974]: I1013 18:41:41.055235 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-w9hqc"] Oct 13 18:41:41 crc kubenswrapper[4974]: I1013 18:41:41.063453 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-w9hqc"] Oct 13 18:41:41 crc kubenswrapper[4974]: I1013 18:41:41.536634 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9"] Oct 13 18:41:41 crc kubenswrapper[4974]: I1013 18:41:41.835313 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f038500-ef77-401b-9e63-755ea9a56695" path="/var/lib/kubelet/pods/2f038500-ef77-401b-9e63-755ea9a56695/volumes" Oct 13 18:41:42 crc kubenswrapper[4974]: I1013 18:41:42.395520 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" event={"ID":"fedc6dd9-1f4c-43f4-9e0b-74292be529a6","Type":"ContainerStarted","Data":"4c2f2b3a0554eb6f8c66ee6b0c82e83be7f022d10134975166b29fc2aed4b37f"} Oct 13 18:41:43 crc kubenswrapper[4974]: I1013 18:41:43.406692 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" event={"ID":"fedc6dd9-1f4c-43f4-9e0b-74292be529a6","Type":"ContainerStarted","Data":"6ceff627d9212f8a0d8a69c7c149787e389196fcce953093acf8d02d2311f9d0"} Oct 13 18:41:43 crc kubenswrapper[4974]: I1013 18:41:43.438064 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" podStartSLOduration=2.765001713 podStartE2EDuration="3.438043489s" podCreationTimestamp="2025-10-13 18:41:40 +0000 UTC" firstStartedPulling="2025-10-13 18:41:41.543116887 +0000 UTC m=+1636.447483007" lastFinishedPulling="2025-10-13 
18:41:42.216158673 +0000 UTC m=+1637.120524783" observedRunningTime="2025-10-13 18:41:43.427987385 +0000 UTC m=+1638.332353475" watchObservedRunningTime="2025-10-13 18:41:43.438043489 +0000 UTC m=+1638.342409579" Oct 13 18:41:47 crc kubenswrapper[4974]: I1013 18:41:47.727561 4974 scope.go:117] "RemoveContainer" containerID="852316323f978c82b4f75ca76cf975a0b30f46ec83deea56da2ba716de24ccd2" Oct 13 18:41:47 crc kubenswrapper[4974]: I1013 18:41:47.785185 4974 scope.go:117] "RemoveContainer" containerID="aed0e2a969ca344755ad2a72349ef12ddb8d5b64a78579e6a544fd756c049ac3" Oct 13 18:41:47 crc kubenswrapper[4974]: I1013 18:41:47.851090 4974 scope.go:117] "RemoveContainer" containerID="4291043869c587289116d62157b7fcc81418609dbb592877d2acd2295ba448a8" Oct 13 18:41:47 crc kubenswrapper[4974]: I1013 18:41:47.904436 4974 scope.go:117] "RemoveContainer" containerID="8842edcc3e9698c59e0930af911f9443192bd37bd3d08e35af227c57ff1977e0" Oct 13 18:41:47 crc kubenswrapper[4974]: I1013 18:41:47.947015 4974 scope.go:117] "RemoveContainer" containerID="d304d035e40cf3ccdd1c00da88f92fa05765850c29b4a27741add4ec95d53776" Oct 13 18:41:47 crc kubenswrapper[4974]: I1013 18:41:47.994866 4974 scope.go:117] "RemoveContainer" containerID="0db80815fbfe5f9e722c5f0be136e8718e2486ad22aa7a3e5c67111ffb7947a2" Oct 13 18:41:48 crc kubenswrapper[4974]: I1013 18:41:48.054358 4974 scope.go:117] "RemoveContainer" containerID="cf7079c43af91ccf2039a49d5c815174b0bc871585f8c595fe21f4e15f59748c" Oct 13 18:41:48 crc kubenswrapper[4974]: I1013 18:41:48.075465 4974 scope.go:117] "RemoveContainer" containerID="4a1dee7d8a1e0fd0eafc37f984fd73038436cd0e3927c9bb3b7f5977ae6bc713" Oct 13 18:41:48 crc kubenswrapper[4974]: I1013 18:41:48.096716 4974 scope.go:117] "RemoveContainer" containerID="4bbff5f7998054525385d30f74d9873da184ca2359e0ec25cabd8776691a8783" Oct 13 18:41:48 crc kubenswrapper[4974]: I1013 18:41:48.121042 4974 scope.go:117] "RemoveContainer" 
containerID="d7ebd1a0be85eda5dbe7c4a76310869b51136d3da0f3bf5b6c47c53a33d23230" Oct 13 18:41:52 crc kubenswrapper[4974]: I1013 18:41:52.039691 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-l2v67"] Oct 13 18:41:52 crc kubenswrapper[4974]: I1013 18:41:52.049498 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-484wv"] Oct 13 18:41:52 crc kubenswrapper[4974]: I1013 18:41:52.059119 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-2q69p"] Oct 13 18:41:52 crc kubenswrapper[4974]: I1013 18:41:52.067676 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-l2v67"] Oct 13 18:41:52 crc kubenswrapper[4974]: I1013 18:41:52.087332 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-484wv"] Oct 13 18:41:52 crc kubenswrapper[4974]: I1013 18:41:52.100751 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-2q69p"] Oct 13 18:41:53 crc kubenswrapper[4974]: I1013 18:41:53.824803 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49fb758b-0291-4449-aa49-7e191cc1b2dc" path="/var/lib/kubelet/pods/49fb758b-0291-4449-aa49-7e191cc1b2dc/volumes" Oct 13 18:41:53 crc kubenswrapper[4974]: I1013 18:41:53.826391 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af98b1ee-3954-4c18-9656-f61280f56b95" path="/var/lib/kubelet/pods/af98b1ee-3954-4c18-9656-f61280f56b95/volumes" Oct 13 18:41:53 crc kubenswrapper[4974]: I1013 18:41:53.827582 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29d4def-3da3-43a3-8331-f1ee4644dad2" path="/var/lib/kubelet/pods/f29d4def-3da3-43a3-8331-f1ee4644dad2/volumes" Oct 13 18:42:07 crc kubenswrapper[4974]: I1013 18:42:07.742786 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:42:07 crc kubenswrapper[4974]: I1013 18:42:07.743462 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:42:10 crc kubenswrapper[4974]: I1013 18:42:10.040955 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-njgsh"] Oct 13 18:42:10 crc kubenswrapper[4974]: I1013 18:42:10.054487 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-njgsh"] Oct 13 18:42:11 crc kubenswrapper[4974]: I1013 18:42:11.041875 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-jgthj"] Oct 13 18:42:11 crc kubenswrapper[4974]: I1013 18:42:11.061148 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-jgthj"] Oct 13 18:42:11 crc kubenswrapper[4974]: I1013 18:42:11.834534 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab455ee5-f8bd-4d4e-b179-524fa3edcc52" path="/var/lib/kubelet/pods/ab455ee5-f8bd-4d4e-b179-524fa3edcc52/volumes" Oct 13 18:42:11 crc kubenswrapper[4974]: I1013 18:42:11.836795 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3" path="/var/lib/kubelet/pods/bf98eaf2-9c3a-46b1-a1ae-4ed9a44f8bd3/volumes" Oct 13 18:42:14 crc kubenswrapper[4974]: I1013 18:42:14.973459 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4rlwd"] Oct 13 18:42:14 crc kubenswrapper[4974]: I1013 18:42:14.977188 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:14 crc kubenswrapper[4974]: I1013 18:42:14.993991 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rlwd"] Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.062841 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-utilities\") pod \"redhat-marketplace-4rlwd\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.062919 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8hdk\" (UniqueName: \"kubernetes.io/projected/d0bf7134-18dc-43f4-8190-ba738c804b68-kube-api-access-p8hdk\") pod \"redhat-marketplace-4rlwd\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.062978 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-catalog-content\") pod \"redhat-marketplace-4rlwd\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.164625 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8hdk\" (UniqueName: \"kubernetes.io/projected/d0bf7134-18dc-43f4-8190-ba738c804b68-kube-api-access-p8hdk\") pod \"redhat-marketplace-4rlwd\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.164753 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-catalog-content\") pod \"redhat-marketplace-4rlwd\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.164964 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-utilities\") pod \"redhat-marketplace-4rlwd\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.165622 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-utilities\") pod \"redhat-marketplace-4rlwd\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.166415 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-catalog-content\") pod \"redhat-marketplace-4rlwd\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.191462 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8hdk\" (UniqueName: \"kubernetes.io/projected/d0bf7134-18dc-43f4-8190-ba738c804b68-kube-api-access-p8hdk\") pod \"redhat-marketplace-4rlwd\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.294168 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.765566 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rlwd"] Oct 13 18:42:15 crc kubenswrapper[4974]: I1013 18:42:15.800124 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rlwd" event={"ID":"d0bf7134-18dc-43f4-8190-ba738c804b68","Type":"ContainerStarted","Data":"55ff4e6b3fb2d88e81e8e29b5811fdc9c47829590e97ce8e95bc9b20cc6fb443"} Oct 13 18:42:16 crc kubenswrapper[4974]: I1013 18:42:16.813318 4974 generic.go:334] "Generic (PLEG): container finished" podID="d0bf7134-18dc-43f4-8190-ba738c804b68" containerID="f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4" exitCode=0 Oct 13 18:42:16 crc kubenswrapper[4974]: I1013 18:42:16.813378 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rlwd" event={"ID":"d0bf7134-18dc-43f4-8190-ba738c804b68","Type":"ContainerDied","Data":"f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4"} Oct 13 18:42:18 crc kubenswrapper[4974]: I1013 18:42:18.859902 4974 generic.go:334] "Generic (PLEG): container finished" podID="d0bf7134-18dc-43f4-8190-ba738c804b68" containerID="ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4" exitCode=0 Oct 13 18:42:18 crc kubenswrapper[4974]: I1013 18:42:18.859949 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rlwd" event={"ID":"d0bf7134-18dc-43f4-8190-ba738c804b68","Type":"ContainerDied","Data":"ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4"} Oct 13 18:42:19 crc kubenswrapper[4974]: I1013 18:42:19.875463 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rlwd" 
event={"ID":"d0bf7134-18dc-43f4-8190-ba738c804b68","Type":"ContainerStarted","Data":"9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c"} Oct 13 18:42:19 crc kubenswrapper[4974]: I1013 18:42:19.901883 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4rlwd" podStartSLOduration=3.272555133 podStartE2EDuration="5.901863164s" podCreationTimestamp="2025-10-13 18:42:14 +0000 UTC" firstStartedPulling="2025-10-13 18:42:16.816070433 +0000 UTC m=+1671.720436533" lastFinishedPulling="2025-10-13 18:42:19.445378444 +0000 UTC m=+1674.349744564" observedRunningTime="2025-10-13 18:42:19.891182752 +0000 UTC m=+1674.795548832" watchObservedRunningTime="2025-10-13 18:42:19.901863164 +0000 UTC m=+1674.806229254" Oct 13 18:42:25 crc kubenswrapper[4974]: I1013 18:42:25.295465 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:25 crc kubenswrapper[4974]: I1013 18:42:25.295878 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:25 crc kubenswrapper[4974]: I1013 18:42:25.348998 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:26 crc kubenswrapper[4974]: I1013 18:42:26.006249 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:26 crc kubenswrapper[4974]: I1013 18:42:26.080240 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rlwd"] Oct 13 18:42:27 crc kubenswrapper[4974]: I1013 18:42:27.978356 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4rlwd" podUID="d0bf7134-18dc-43f4-8190-ba738c804b68" containerName="registry-server" 
containerID="cri-o://9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c" gracePeriod=2 Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.471945 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.578788 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p8hdk\" (UniqueName: \"kubernetes.io/projected/d0bf7134-18dc-43f4-8190-ba738c804b68-kube-api-access-p8hdk\") pod \"d0bf7134-18dc-43f4-8190-ba738c804b68\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.579008 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-utilities\") pod \"d0bf7134-18dc-43f4-8190-ba738c804b68\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.579103 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-catalog-content\") pod \"d0bf7134-18dc-43f4-8190-ba738c804b68\" (UID: \"d0bf7134-18dc-43f4-8190-ba738c804b68\") " Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.581184 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-utilities" (OuterVolumeSpecName: "utilities") pod "d0bf7134-18dc-43f4-8190-ba738c804b68" (UID: "d0bf7134-18dc-43f4-8190-ba738c804b68"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.584989 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0bf7134-18dc-43f4-8190-ba738c804b68-kube-api-access-p8hdk" (OuterVolumeSpecName: "kube-api-access-p8hdk") pod "d0bf7134-18dc-43f4-8190-ba738c804b68" (UID: "d0bf7134-18dc-43f4-8190-ba738c804b68"). InnerVolumeSpecName "kube-api-access-p8hdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.603082 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0bf7134-18dc-43f4-8190-ba738c804b68" (UID: "d0bf7134-18dc-43f4-8190-ba738c804b68"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.682586 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.682627 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p8hdk\" (UniqueName: \"kubernetes.io/projected/d0bf7134-18dc-43f4-8190-ba738c804b68-kube-api-access-p8hdk\") on node \"crc\" DevicePath \"\"" Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.682645 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0bf7134-18dc-43f4-8190-ba738c804b68-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.996471 4974 generic.go:334] "Generic (PLEG): container finished" podID="d0bf7134-18dc-43f4-8190-ba738c804b68" 
containerID="9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c" exitCode=0 Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.996536 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rlwd" event={"ID":"d0bf7134-18dc-43f4-8190-ba738c804b68","Type":"ContainerDied","Data":"9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c"} Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.996567 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4rlwd" Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.996606 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4rlwd" event={"ID":"d0bf7134-18dc-43f4-8190-ba738c804b68","Type":"ContainerDied","Data":"55ff4e6b3fb2d88e81e8e29b5811fdc9c47829590e97ce8e95bc9b20cc6fb443"} Oct 13 18:42:28 crc kubenswrapper[4974]: I1013 18:42:28.996633 4974 scope.go:117] "RemoveContainer" containerID="9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c" Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.036425 4974 scope.go:117] "RemoveContainer" containerID="ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4" Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.058866 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rlwd"] Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.073631 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4rlwd"] Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.092904 4974 scope.go:117] "RemoveContainer" containerID="f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4" Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.127212 4974 scope.go:117] "RemoveContainer" containerID="9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c" Oct 13 
18:42:29 crc kubenswrapper[4974]: E1013 18:42:29.129370 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c\": container with ID starting with 9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c not found: ID does not exist" containerID="9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c" Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.129405 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c"} err="failed to get container status \"9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c\": rpc error: code = NotFound desc = could not find container \"9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c\": container with ID starting with 9652c9f549bfe188100c85ce56988f424c9eb029547cfae5d7065e3d56f7c92c not found: ID does not exist" Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.129430 4974 scope.go:117] "RemoveContainer" containerID="ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4" Oct 13 18:42:29 crc kubenswrapper[4974]: E1013 18:42:29.129902 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4\": container with ID starting with ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4 not found: ID does not exist" containerID="ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4" Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.129929 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4"} err="failed to get container status 
\"ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4\": rpc error: code = NotFound desc = could not find container \"ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4\": container with ID starting with ba00a7c0f36b30cc6281b0eeac4e20ba7608834bf831e3b92526decbbde5b5c4 not found: ID does not exist" Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.129948 4974 scope.go:117] "RemoveContainer" containerID="f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4" Oct 13 18:42:29 crc kubenswrapper[4974]: E1013 18:42:29.130350 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4\": container with ID starting with f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4 not found: ID does not exist" containerID="f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4" Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.130374 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4"} err="failed to get container status \"f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4\": rpc error: code = NotFound desc = could not find container \"f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4\": container with ID starting with f16165bf54774371b11ddbde6dfbecabd94754378df3d90643a02cfa5f958ce4 not found: ID does not exist" Oct 13 18:42:29 crc kubenswrapper[4974]: I1013 18:42:29.830117 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0bf7134-18dc-43f4-8190-ba738c804b68" path="/var/lib/kubelet/pods/d0bf7134-18dc-43f4-8190-ba738c804b68/volumes" Oct 13 18:42:37 crc kubenswrapper[4974]: I1013 18:42:37.743218 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:42:37 crc kubenswrapper[4974]: I1013 18:42:37.743913 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:42:37 crc kubenswrapper[4974]: I1013 18:42:37.743968 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:42:37 crc kubenswrapper[4974]: I1013 18:42:37.744815 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 18:42:37 crc kubenswrapper[4974]: I1013 18:42:37.744869 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" gracePeriod=600 Oct 13 18:42:37 crc kubenswrapper[4974]: E1013 18:42:37.884003 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:42:38 crc kubenswrapper[4974]: I1013 18:42:38.116189 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" exitCode=0 Oct 13 18:42:38 crc kubenswrapper[4974]: I1013 18:42:38.116270 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2"} Oct 13 18:42:38 crc kubenswrapper[4974]: I1013 18:42:38.116509 4974 scope.go:117] "RemoveContainer" containerID="ea39e73c9d324740300df4165ffa4e49cea26ef15903575dd3e89904e7bd3e53" Oct 13 18:42:38 crc kubenswrapper[4974]: I1013 18:42:38.118245 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:42:38 crc kubenswrapper[4974]: E1013 18:42:38.118545 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:42:48 crc kubenswrapper[4974]: I1013 18:42:48.364277 4974 scope.go:117] "RemoveContainer" containerID="ff5dec3b18644db01f42beb495cb595475713647a9fa6ae84f9caa9404dd6b33" Oct 13 18:42:48 crc kubenswrapper[4974]: I1013 18:42:48.411934 4974 scope.go:117] "RemoveContainer" containerID="010bc063ed964ef76d8cfee518d76c71e443363c2c2521f1ecac16bb15de3e8a" Oct 13 18:42:48 crc kubenswrapper[4974]: I1013 18:42:48.467582 4974 scope.go:117] 
"RemoveContainer" containerID="eef82df6ce6cd9ca241efc80766eea2428b0a3c27c2c94a2478846595e58ac9a" Oct 13 18:42:48 crc kubenswrapper[4974]: I1013 18:42:48.517693 4974 scope.go:117] "RemoveContainer" containerID="dee013a4e07ee8ac5a0f2cd484903875bfd6fcb36664a869168c4b0a3a09dc9e" Oct 13 18:42:48 crc kubenswrapper[4974]: I1013 18:42:48.582133 4974 scope.go:117] "RemoveContainer" containerID="eb645210fcad7a793e788506a5f516ba05c4c3d70cb3589162f560823eeb8cfb" Oct 13 18:42:51 crc kubenswrapper[4974]: I1013 18:42:51.811485 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:42:51 crc kubenswrapper[4974]: E1013 18:42:51.812093 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:42:54 crc kubenswrapper[4974]: I1013 18:42:54.065170 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-65hg2"] Oct 13 18:42:54 crc kubenswrapper[4974]: I1013 18:42:54.089274 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-j6b9d"] Oct 13 18:42:54 crc kubenswrapper[4974]: I1013 18:42:54.097017 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-wqqtz"] Oct 13 18:42:54 crc kubenswrapper[4974]: I1013 18:42:54.105199 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-wqqtz"] Oct 13 18:42:54 crc kubenswrapper[4974]: I1013 18:42:54.112632 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-j6b9d"] Oct 13 18:42:54 crc kubenswrapper[4974]: I1013 18:42:54.119992 
4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-65hg2"] Oct 13 18:42:55 crc kubenswrapper[4974]: I1013 18:42:55.826091 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27f6985f-10d9-40de-8441-8289ed83515c" path="/var/lib/kubelet/pods/27f6985f-10d9-40de-8441-8289ed83515c/volumes" Oct 13 18:42:55 crc kubenswrapper[4974]: I1013 18:42:55.827176 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2faa46ff-30c6-4bc7-9b04-69088774b56d" path="/var/lib/kubelet/pods/2faa46ff-30c6-4bc7-9b04-69088774b56d/volumes" Oct 13 18:42:55 crc kubenswrapper[4974]: I1013 18:42:55.827874 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a86a831-6077-4935-b8ce-f48755ebe658" path="/var/lib/kubelet/pods/8a86a831-6077-4935-b8ce-f48755ebe658/volumes" Oct 13 18:42:59 crc kubenswrapper[4974]: I1013 18:42:59.388788 4974 generic.go:334] "Generic (PLEG): container finished" podID="fedc6dd9-1f4c-43f4-9e0b-74292be529a6" containerID="6ceff627d9212f8a0d8a69c7c149787e389196fcce953093acf8d02d2311f9d0" exitCode=0 Oct 13 18:42:59 crc kubenswrapper[4974]: I1013 18:42:59.388919 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" event={"ID":"fedc6dd9-1f4c-43f4-9e0b-74292be529a6","Type":"ContainerDied","Data":"6ceff627d9212f8a0d8a69c7c149787e389196fcce953093acf8d02d2311f9d0"} Oct 13 18:43:00 crc kubenswrapper[4974]: I1013 18:43:00.947809 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.065711 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-ssh-key\") pod \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.065877 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r86ql\" (UniqueName: \"kubernetes.io/projected/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-kube-api-access-r86ql\") pod \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.065983 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-inventory\") pod \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\" (UID: \"fedc6dd9-1f4c-43f4-9e0b-74292be529a6\") " Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.075348 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-kube-api-access-r86ql" (OuterVolumeSpecName: "kube-api-access-r86ql") pod "fedc6dd9-1f4c-43f4-9e0b-74292be529a6" (UID: "fedc6dd9-1f4c-43f4-9e0b-74292be529a6"). InnerVolumeSpecName "kube-api-access-r86ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.118914 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fedc6dd9-1f4c-43f4-9e0b-74292be529a6" (UID: "fedc6dd9-1f4c-43f4-9e0b-74292be529a6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.123252 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-inventory" (OuterVolumeSpecName: "inventory") pod "fedc6dd9-1f4c-43f4-9e0b-74292be529a6" (UID: "fedc6dd9-1f4c-43f4-9e0b-74292be529a6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.168886 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.168930 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.168944 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r86ql\" (UniqueName: \"kubernetes.io/projected/fedc6dd9-1f4c-43f4-9e0b-74292be529a6-kube-api-access-r86ql\") on node \"crc\" DevicePath \"\"" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.414558 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" event={"ID":"fedc6dd9-1f4c-43f4-9e0b-74292be529a6","Type":"ContainerDied","Data":"4c2f2b3a0554eb6f8c66ee6b0c82e83be7f022d10134975166b29fc2aed4b37f"} Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.414605 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c2f2b3a0554eb6f8c66ee6b0c82e83be7f022d10134975166b29fc2aed4b37f" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.414693 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.508959 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t"] Oct 13 18:43:01 crc kubenswrapper[4974]: E1013 18:43:01.509446 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fedc6dd9-1f4c-43f4-9e0b-74292be529a6" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.509470 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fedc6dd9-1f4c-43f4-9e0b-74292be529a6" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 13 18:43:01 crc kubenswrapper[4974]: E1013 18:43:01.509501 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bf7134-18dc-43f4-8190-ba738c804b68" containerName="registry-server" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.509510 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bf7134-18dc-43f4-8190-ba738c804b68" containerName="registry-server" Oct 13 18:43:01 crc kubenswrapper[4974]: E1013 18:43:01.509530 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bf7134-18dc-43f4-8190-ba738c804b68" containerName="extract-content" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.509538 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bf7134-18dc-43f4-8190-ba738c804b68" containerName="extract-content" Oct 13 18:43:01 crc kubenswrapper[4974]: E1013 18:43:01.509554 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bf7134-18dc-43f4-8190-ba738c804b68" containerName="extract-utilities" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.509562 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bf7134-18dc-43f4-8190-ba738c804b68" containerName="extract-utilities" Oct 13 18:43:01 crc 
kubenswrapper[4974]: I1013 18:43:01.509839 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bf7134-18dc-43f4-8190-ba738c804b68" containerName="registry-server" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.509867 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="fedc6dd9-1f4c-43f4-9e0b-74292be529a6" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.511265 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.517321 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.517400 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.518406 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.521633 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t"] Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.522154 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.677333 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.677508 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.677714 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p7wv\" (UniqueName: \"kubernetes.io/projected/f2eee5ad-fe26-46b2-af3c-1477c1513609-kube-api-access-8p7wv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.780901 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.780999 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.781065 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8p7wv\" (UniqueName: \"kubernetes.io/projected/f2eee5ad-fe26-46b2-af3c-1477c1513609-kube-api-access-8p7wv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.794542 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.803394 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.814953 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p7wv\" (UniqueName: \"kubernetes.io/projected/f2eee5ad-fe26-46b2-af3c-1477c1513609-kube-api-access-8p7wv\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:01 crc kubenswrapper[4974]: I1013 18:43:01.846391 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:02 crc kubenswrapper[4974]: I1013 18:43:02.435168 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t"] Oct 13 18:43:03 crc kubenswrapper[4974]: I1013 18:43:03.438613 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" event={"ID":"f2eee5ad-fe26-46b2-af3c-1477c1513609","Type":"ContainerStarted","Data":"0bae576fb90158e4caa7a817b5adbe42751b8f131ecdf4c4c7cad8916a0d1204"} Oct 13 18:43:03 crc kubenswrapper[4974]: I1013 18:43:03.438943 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" event={"ID":"f2eee5ad-fe26-46b2-af3c-1477c1513609","Type":"ContainerStarted","Data":"45d8bb30c7c7a7fe0980e662d9f97131ec351433a847e74514dcf155a03d7389"} Oct 13 18:43:03 crc kubenswrapper[4974]: I1013 18:43:03.463683 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" podStartSLOduration=2.008360176 podStartE2EDuration="2.463639692s" podCreationTimestamp="2025-10-13 18:43:01 +0000 UTC" firstStartedPulling="2025-10-13 18:43:02.444159044 +0000 UTC m=+1717.348525124" lastFinishedPulling="2025-10-13 18:43:02.89943855 +0000 UTC m=+1717.803804640" observedRunningTime="2025-10-13 18:43:03.45612394 +0000 UTC m=+1718.360490020" watchObservedRunningTime="2025-10-13 18:43:03.463639692 +0000 UTC m=+1718.368005782" Oct 13 18:43:04 crc kubenswrapper[4974]: I1013 18:43:04.032460 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e1d3-account-create-jzpp5"] Oct 13 18:43:04 crc kubenswrapper[4974]: I1013 18:43:04.044812 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1ad5-account-create-9j2hm"] Oct 13 18:43:04 
crc kubenswrapper[4974]: I1013 18:43:04.055453 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-30c2-account-create-grnv8"] Oct 13 18:43:04 crc kubenswrapper[4974]: I1013 18:43:04.064604 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-30c2-account-create-grnv8"] Oct 13 18:43:04 crc kubenswrapper[4974]: I1013 18:43:04.071555 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1ad5-account-create-9j2hm"] Oct 13 18:43:04 crc kubenswrapper[4974]: I1013 18:43:04.078929 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e1d3-account-create-jzpp5"] Oct 13 18:43:04 crc kubenswrapper[4974]: I1013 18:43:04.812465 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:43:04 crc kubenswrapper[4974]: E1013 18:43:04.812762 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:43:05 crc kubenswrapper[4974]: I1013 18:43:05.835770 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b287482e-c536-4a68-8c64-3e8fbfbc8c4c" path="/var/lib/kubelet/pods/b287482e-c536-4a68-8c64-3e8fbfbc8c4c/volumes" Oct 13 18:43:05 crc kubenswrapper[4974]: I1013 18:43:05.837982 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7" path="/var/lib/kubelet/pods/c7bc2aa8-96b5-412b-9dd8-2263e8a8e5a7/volumes" Oct 13 18:43:05 crc kubenswrapper[4974]: I1013 18:43:05.839497 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e8a7c093-f85f-4362-a73f-fea72dd2833b" path="/var/lib/kubelet/pods/e8a7c093-f85f-4362-a73f-fea72dd2833b/volumes" Oct 13 18:43:08 crc kubenswrapper[4974]: I1013 18:43:08.512166 4974 generic.go:334] "Generic (PLEG): container finished" podID="f2eee5ad-fe26-46b2-af3c-1477c1513609" containerID="0bae576fb90158e4caa7a817b5adbe42751b8f131ecdf4c4c7cad8916a0d1204" exitCode=0 Oct 13 18:43:08 crc kubenswrapper[4974]: I1013 18:43:08.512286 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" event={"ID":"f2eee5ad-fe26-46b2-af3c-1477c1513609","Type":"ContainerDied","Data":"0bae576fb90158e4caa7a817b5adbe42751b8f131ecdf4c4c7cad8916a0d1204"} Oct 13 18:43:09 crc kubenswrapper[4974]: I1013 18:43:09.986814 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.093615 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-ssh-key\") pod \"f2eee5ad-fe26-46b2-af3c-1477c1513609\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.093860 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p7wv\" (UniqueName: \"kubernetes.io/projected/f2eee5ad-fe26-46b2-af3c-1477c1513609-kube-api-access-8p7wv\") pod \"f2eee5ad-fe26-46b2-af3c-1477c1513609\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.093946 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-inventory\") pod \"f2eee5ad-fe26-46b2-af3c-1477c1513609\" (UID: \"f2eee5ad-fe26-46b2-af3c-1477c1513609\") " Oct 13 18:43:10 
crc kubenswrapper[4974]: I1013 18:43:10.099350 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2eee5ad-fe26-46b2-af3c-1477c1513609-kube-api-access-8p7wv" (OuterVolumeSpecName: "kube-api-access-8p7wv") pod "f2eee5ad-fe26-46b2-af3c-1477c1513609" (UID: "f2eee5ad-fe26-46b2-af3c-1477c1513609"). InnerVolumeSpecName "kube-api-access-8p7wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.129487 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-inventory" (OuterVolumeSpecName: "inventory") pod "f2eee5ad-fe26-46b2-af3c-1477c1513609" (UID: "f2eee5ad-fe26-46b2-af3c-1477c1513609"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.143024 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f2eee5ad-fe26-46b2-af3c-1477c1513609" (UID: "f2eee5ad-fe26-46b2-af3c-1477c1513609"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.197895 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.198232 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p7wv\" (UniqueName: \"kubernetes.io/projected/f2eee5ad-fe26-46b2-af3c-1477c1513609-kube-api-access-8p7wv\") on node \"crc\" DevicePath \"\"" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.198457 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2eee5ad-fe26-46b2-af3c-1477c1513609-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.540612 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" event={"ID":"f2eee5ad-fe26-46b2-af3c-1477c1513609","Type":"ContainerDied","Data":"45d8bb30c7c7a7fe0980e662d9f97131ec351433a847e74514dcf155a03d7389"} Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.540700 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45d8bb30c7c7a7fe0980e662d9f97131ec351433a847e74514dcf155a03d7389" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.541164 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.632253 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7"] Oct 13 18:43:10 crc kubenswrapper[4974]: E1013 18:43:10.632715 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2eee5ad-fe26-46b2-af3c-1477c1513609" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.632738 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2eee5ad-fe26-46b2-af3c-1477c1513609" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.632942 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2eee5ad-fe26-46b2-af3c-1477c1513609" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.633675 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.642361 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.643572 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.643873 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.644544 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.656141 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7"] Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.812046 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwhx7\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.812390 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwhx7\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.812600 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghp2g\" (UniqueName: \"kubernetes.io/projected/7e2087d1-027f-4fc7-8a75-5421f0e55868-kube-api-access-ghp2g\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwhx7\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.914036 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghp2g\" (UniqueName: \"kubernetes.io/projected/7e2087d1-027f-4fc7-8a75-5421f0e55868-kube-api-access-ghp2g\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwhx7\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.914221 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwhx7\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.914252 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwhx7\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.920056 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwhx7\" (UID: 
\"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.925381 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwhx7\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.945857 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghp2g\" (UniqueName: \"kubernetes.io/projected/7e2087d1-027f-4fc7-8a75-5421f0e55868-kube-api-access-ghp2g\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mwhx7\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:10 crc kubenswrapper[4974]: I1013 18:43:10.957518 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:11 crc kubenswrapper[4974]: I1013 18:43:11.538869 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7"] Oct 13 18:43:11 crc kubenswrapper[4974]: W1013 18:43:11.548727 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e2087d1_027f_4fc7_8a75_5421f0e55868.slice/crio-e17ebd3e626c8f31317219cfb47ab57ac9b46319452686e881814bbc67c6092d WatchSource:0}: Error finding container e17ebd3e626c8f31317219cfb47ab57ac9b46319452686e881814bbc67c6092d: Status 404 returned error can't find the container with id e17ebd3e626c8f31317219cfb47ab57ac9b46319452686e881814bbc67c6092d Oct 13 18:43:12 crc kubenswrapper[4974]: I1013 18:43:12.577115 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" event={"ID":"7e2087d1-027f-4fc7-8a75-5421f0e55868","Type":"ContainerStarted","Data":"a80f42fd4bae304bbf9a8ebc23c392ee8f0035687a40c83e7dc2541ba3fcecd1"} Oct 13 18:43:12 crc kubenswrapper[4974]: I1013 18:43:12.577508 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" event={"ID":"7e2087d1-027f-4fc7-8a75-5421f0e55868","Type":"ContainerStarted","Data":"e17ebd3e626c8f31317219cfb47ab57ac9b46319452686e881814bbc67c6092d"} Oct 13 18:43:12 crc kubenswrapper[4974]: I1013 18:43:12.611830 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" podStartSLOduration=2.027212603 podStartE2EDuration="2.611805272s" podCreationTimestamp="2025-10-13 18:43:10 +0000 UTC" firstStartedPulling="2025-10-13 18:43:11.553150806 +0000 UTC m=+1726.457516896" lastFinishedPulling="2025-10-13 18:43:12.137743445 +0000 UTC m=+1727.042109565" 
observedRunningTime="2025-10-13 18:43:12.603625821 +0000 UTC m=+1727.507991951" watchObservedRunningTime="2025-10-13 18:43:12.611805272 +0000 UTC m=+1727.516171392" Oct 13 18:43:16 crc kubenswrapper[4974]: I1013 18:43:16.811983 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:43:16 crc kubenswrapper[4974]: E1013 18:43:16.813273 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:43:29 crc kubenswrapper[4974]: I1013 18:43:29.050134 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sk544"] Oct 13 18:43:29 crc kubenswrapper[4974]: I1013 18:43:29.059604 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-sk544"] Oct 13 18:43:29 crc kubenswrapper[4974]: I1013 18:43:29.827142 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7d74baa-5dd0-454d-af5d-474c29d83d21" path="/var/lib/kubelet/pods/b7d74baa-5dd0-454d-af5d-474c29d83d21/volumes" Oct 13 18:43:31 crc kubenswrapper[4974]: I1013 18:43:31.812289 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:43:31 crc kubenswrapper[4974]: E1013 18:43:31.812534 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:43:42 crc kubenswrapper[4974]: I1013 18:43:42.811786 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:43:42 crc kubenswrapper[4974]: E1013 18:43:42.812721 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:43:48 crc kubenswrapper[4974]: I1013 18:43:48.765128 4974 scope.go:117] "RemoveContainer" containerID="a19a145f311b1b683a2de53c61ce02eaf50556bf4cba22a2ba3092fd81bf65e6" Oct 13 18:43:48 crc kubenswrapper[4974]: I1013 18:43:48.802175 4974 scope.go:117] "RemoveContainer" containerID="cb92217511c837964b294f01447b13c96c428d64da142805b454fdb92a2264f6" Oct 13 18:43:48 crc kubenswrapper[4974]: I1013 18:43:48.881389 4974 scope.go:117] "RemoveContainer" containerID="4bcdf0aa9d90318d4969b3813dd38297746fbb096340728f8eac758dfac0041b" Oct 13 18:43:48 crc kubenswrapper[4974]: I1013 18:43:48.947053 4974 scope.go:117] "RemoveContainer" containerID="3e147a6637689712582d2edf869325765299ddbf20cbf452eb204f6b399e2503" Oct 13 18:43:49 crc kubenswrapper[4974]: I1013 18:43:49.006052 4974 scope.go:117] "RemoveContainer" containerID="3a154b3aab623f21b5fffe856a6752def8ce1bdb6b4174d40014dc50108ef0da" Oct 13 18:43:49 crc kubenswrapper[4974]: I1013 18:43:49.048490 4974 scope.go:117] "RemoveContainer" containerID="eebfad1b1cce0c4fbbd799ff15fad8e6d401b0a58a14d6064ffa656f94b4b983" Oct 13 18:43:49 crc kubenswrapper[4974]: I1013 18:43:49.096678 4974 scope.go:117] "RemoveContainer" 
containerID="d5371d8e229af4bda4cdf80ec9b0eeea6e560bb60839e1b1a87980753f9b7616" Oct 13 18:43:52 crc kubenswrapper[4974]: I1013 18:43:52.043105 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-5q4sp"] Oct 13 18:43:52 crc kubenswrapper[4974]: I1013 18:43:52.053542 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-5q4sp"] Oct 13 18:43:53 crc kubenswrapper[4974]: I1013 18:43:53.813424 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:43:53 crc kubenswrapper[4974]: E1013 18:43:53.814119 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:43:53 crc kubenswrapper[4974]: I1013 18:43:53.826879 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f00ec4d-7131-4fbd-9077-53b53ea0abc1" path="/var/lib/kubelet/pods/2f00ec4d-7131-4fbd-9077-53b53ea0abc1/volumes" Oct 13 18:43:55 crc kubenswrapper[4974]: I1013 18:43:55.049971 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w4zh2"] Oct 13 18:43:55 crc kubenswrapper[4974]: I1013 18:43:55.061999 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-w4zh2"] Oct 13 18:43:55 crc kubenswrapper[4974]: I1013 18:43:55.110986 4974 generic.go:334] "Generic (PLEG): container finished" podID="7e2087d1-027f-4fc7-8a75-5421f0e55868" containerID="a80f42fd4bae304bbf9a8ebc23c392ee8f0035687a40c83e7dc2541ba3fcecd1" exitCode=0 Oct 13 18:43:55 crc kubenswrapper[4974]: I1013 18:43:55.111049 4974 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" event={"ID":"7e2087d1-027f-4fc7-8a75-5421f0e55868","Type":"ContainerDied","Data":"a80f42fd4bae304bbf9a8ebc23c392ee8f0035687a40c83e7dc2541ba3fcecd1"} Oct 13 18:43:55 crc kubenswrapper[4974]: I1013 18:43:55.836626 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3edde18-d8b7-4d74-a226-2078fb905c13" path="/var/lib/kubelet/pods/e3edde18-d8b7-4d74-a226-2078fb905c13/volumes" Oct 13 18:43:56 crc kubenswrapper[4974]: I1013 18:43:56.510551 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:56 crc kubenswrapper[4974]: I1013 18:43:56.633776 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghp2g\" (UniqueName: \"kubernetes.io/projected/7e2087d1-027f-4fc7-8a75-5421f0e55868-kube-api-access-ghp2g\") pod \"7e2087d1-027f-4fc7-8a75-5421f0e55868\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " Oct 13 18:43:56 crc kubenswrapper[4974]: I1013 18:43:56.634168 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-ssh-key\") pod \"7e2087d1-027f-4fc7-8a75-5421f0e55868\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " Oct 13 18:43:56 crc kubenswrapper[4974]: I1013 18:43:56.634197 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-inventory\") pod \"7e2087d1-027f-4fc7-8a75-5421f0e55868\" (UID: \"7e2087d1-027f-4fc7-8a75-5421f0e55868\") " Oct 13 18:43:56 crc kubenswrapper[4974]: I1013 18:43:56.639766 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7e2087d1-027f-4fc7-8a75-5421f0e55868-kube-api-access-ghp2g" (OuterVolumeSpecName: "kube-api-access-ghp2g") pod "7e2087d1-027f-4fc7-8a75-5421f0e55868" (UID: "7e2087d1-027f-4fc7-8a75-5421f0e55868"). InnerVolumeSpecName "kube-api-access-ghp2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:43:56 crc kubenswrapper[4974]: I1013 18:43:56.660997 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-inventory" (OuterVolumeSpecName: "inventory") pod "7e2087d1-027f-4fc7-8a75-5421f0e55868" (UID: "7e2087d1-027f-4fc7-8a75-5421f0e55868"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:43:56 crc kubenswrapper[4974]: I1013 18:43:56.685804 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7e2087d1-027f-4fc7-8a75-5421f0e55868" (UID: "7e2087d1-027f-4fc7-8a75-5421f0e55868"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:43:56 crc kubenswrapper[4974]: I1013 18:43:56.736815 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghp2g\" (UniqueName: \"kubernetes.io/projected/7e2087d1-027f-4fc7-8a75-5421f0e55868-kube-api-access-ghp2g\") on node \"crc\" DevicePath \"\"" Oct 13 18:43:56 crc kubenswrapper[4974]: I1013 18:43:56.736849 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:43:56 crc kubenswrapper[4974]: I1013 18:43:56.736861 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7e2087d1-027f-4fc7-8a75-5421f0e55868-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.128894 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" event={"ID":"7e2087d1-027f-4fc7-8a75-5421f0e55868","Type":"ContainerDied","Data":"e17ebd3e626c8f31317219cfb47ab57ac9b46319452686e881814bbc67c6092d"} Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.128936 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e17ebd3e626c8f31317219cfb47ab57ac9b46319452686e881814bbc67c6092d" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.128938 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mwhx7" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.238278 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh"] Oct 13 18:43:57 crc kubenswrapper[4974]: E1013 18:43:57.239327 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e2087d1-027f-4fc7-8a75-5421f0e55868" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.239354 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e2087d1-027f-4fc7-8a75-5421f0e55868" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.239744 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e2087d1-027f-4fc7-8a75-5421f0e55868" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.240748 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.242770 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.242880 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.243309 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.244321 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.262853 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh"] Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.367058 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.367411 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.367673 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjp4f\" (UniqueName: \"kubernetes.io/projected/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-kube-api-access-qjp4f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.469812 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjp4f\" (UniqueName: \"kubernetes.io/projected/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-kube-api-access-qjp4f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.469893 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.469938 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.473649 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh\" (UID: 
\"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.473983 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.498206 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjp4f\" (UniqueName: \"kubernetes.io/projected/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-kube-api-access-qjp4f\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:57 crc kubenswrapper[4974]: I1013 18:43:57.562861 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:43:58 crc kubenswrapper[4974]: I1013 18:43:58.131369 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh"] Oct 13 18:43:58 crc kubenswrapper[4974]: W1013 18:43:58.176005 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf61911c_89bc_4e8c_a327_6c1bab3c7d5d.slice/crio-cb37382a7946acf3cac004fd63df30c43f7a97e5d78b92206567496a10877c99 WatchSource:0}: Error finding container cb37382a7946acf3cac004fd63df30c43f7a97e5d78b92206567496a10877c99: Status 404 returned error can't find the container with id cb37382a7946acf3cac004fd63df30c43f7a97e5d78b92206567496a10877c99 Oct 13 18:43:58 crc kubenswrapper[4974]: I1013 18:43:58.179477 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 18:43:59 crc kubenswrapper[4974]: I1013 18:43:59.180723 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" event={"ID":"af61911c-89bc-4e8c-a327-6c1bab3c7d5d","Type":"ContainerStarted","Data":"99d3c6a65847595b17e92fdc3392ff8d4f3fdd3011c4c5cd64a70c068845afc3"} Oct 13 18:43:59 crc kubenswrapper[4974]: I1013 18:43:59.181920 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" event={"ID":"af61911c-89bc-4e8c-a327-6c1bab3c7d5d","Type":"ContainerStarted","Data":"cb37382a7946acf3cac004fd63df30c43f7a97e5d78b92206567496a10877c99"} Oct 13 18:43:59 crc kubenswrapper[4974]: I1013 18:43:59.215716 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" podStartSLOduration=1.634378199 podStartE2EDuration="2.215631848s" podCreationTimestamp="2025-10-13 18:43:57 +0000 UTC" 
firstStartedPulling="2025-10-13 18:43:58.179198739 +0000 UTC m=+1773.083564839" lastFinishedPulling="2025-10-13 18:43:58.760452368 +0000 UTC m=+1773.664818488" observedRunningTime="2025-10-13 18:43:59.201225541 +0000 UTC m=+1774.105591631" watchObservedRunningTime="2025-10-13 18:43:59.215631848 +0000 UTC m=+1774.119997938" Oct 13 18:44:08 crc kubenswrapper[4974]: I1013 18:44:08.813195 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:44:08 crc kubenswrapper[4974]: E1013 18:44:08.838683 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:44:23 crc kubenswrapper[4974]: I1013 18:44:23.811902 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:44:23 crc kubenswrapper[4974]: E1013 18:44:23.812654 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:44:35 crc kubenswrapper[4974]: I1013 18:44:35.817610 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:44:35 crc kubenswrapper[4974]: E1013 18:44:35.818888 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:44:38 crc kubenswrapper[4974]: I1013 18:44:38.037201 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-gvkwd"] Oct 13 18:44:38 crc kubenswrapper[4974]: I1013 18:44:38.044174 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-gvkwd"] Oct 13 18:44:39 crc kubenswrapper[4974]: I1013 18:44:39.825822 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02183b03-008b-42e4-89f7-7b1186eada64" path="/var/lib/kubelet/pods/02183b03-008b-42e4-89f7-7b1186eada64/volumes" Oct 13 18:44:49 crc kubenswrapper[4974]: I1013 18:44:49.263105 4974 scope.go:117] "RemoveContainer" containerID="cca5ce5d3932d896ce962b65b92ad60d8c7b98ae647438b2495d5c5deb0372a2" Oct 13 18:44:49 crc kubenswrapper[4974]: I1013 18:44:49.332540 4974 scope.go:117] "RemoveContainer" containerID="c54198f199657ad749421ec9dd41c988fc735d4998cf1446050d644c4397ef0e" Oct 13 18:44:49 crc kubenswrapper[4974]: I1013 18:44:49.413021 4974 scope.go:117] "RemoveContainer" containerID="bd40b0f1661d69f3d7d8721979684d97ed639a7713fb21ba07d3f425d47ffd53" Oct 13 18:44:49 crc kubenswrapper[4974]: I1013 18:44:49.811455 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:44:49 crc kubenswrapper[4974]: E1013 18:44:49.811822 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.150619 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8"] Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.152884 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.156759 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.157351 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.162111 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8"] Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.239525 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfxh9\" (UniqueName: \"kubernetes.io/projected/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-kube-api-access-qfxh9\") pod \"collect-profiles-29339685-dqhp8\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.239684 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-config-volume\") pod \"collect-profiles-29339685-dqhp8\" (UID: 
\"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.239734 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-secret-volume\") pod \"collect-profiles-29339685-dqhp8\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.340560 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfxh9\" (UniqueName: \"kubernetes.io/projected/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-kube-api-access-qfxh9\") pod \"collect-profiles-29339685-dqhp8\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.340745 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-config-volume\") pod \"collect-profiles-29339685-dqhp8\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.340788 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-secret-volume\") pod \"collect-profiles-29339685-dqhp8\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.342636 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-config-volume\") pod \"collect-profiles-29339685-dqhp8\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.350313 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-secret-volume\") pod \"collect-profiles-29339685-dqhp8\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.361472 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfxh9\" (UniqueName: \"kubernetes.io/projected/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-kube-api-access-qfxh9\") pod \"collect-profiles-29339685-dqhp8\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.499794 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.811442 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:45:00 crc kubenswrapper[4974]: E1013 18:45:00.811795 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:45:00 crc kubenswrapper[4974]: I1013 18:45:00.991516 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8"] Oct 13 18:45:01 crc kubenswrapper[4974]: I1013 18:45:01.893172 4974 generic.go:334] "Generic (PLEG): container finished" podID="bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd" containerID="0054e29819a35a39b42907db5712d1dddb819715f8daaceabf91bbd4f5a2e07c" exitCode=0 Oct 13 18:45:01 crc kubenswrapper[4974]: I1013 18:45:01.893245 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" event={"ID":"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd","Type":"ContainerDied","Data":"0054e29819a35a39b42907db5712d1dddb819715f8daaceabf91bbd4f5a2e07c"} Oct 13 18:45:01 crc kubenswrapper[4974]: I1013 18:45:01.893705 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" event={"ID":"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd","Type":"ContainerStarted","Data":"5eb44591de55eb540aecdd9b45d1d016fc6f4526ced2b70a1140640db5d7a145"} Oct 13 18:45:01 crc kubenswrapper[4974]: I1013 18:45:01.896195 4974 generic.go:334] 
"Generic (PLEG): container finished" podID="af61911c-89bc-4e8c-a327-6c1bab3c7d5d" containerID="99d3c6a65847595b17e92fdc3392ff8d4f3fdd3011c4c5cd64a70c068845afc3" exitCode=2 Oct 13 18:45:01 crc kubenswrapper[4974]: I1013 18:45:01.896227 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" event={"ID":"af61911c-89bc-4e8c-a327-6c1bab3c7d5d","Type":"ContainerDied","Data":"99d3c6a65847595b17e92fdc3392ff8d4f3fdd3011c4c5cd64a70c068845afc3"} Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.382485 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.390022 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.502678 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-ssh-key\") pod \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.503109 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfxh9\" (UniqueName: \"kubernetes.io/projected/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-kube-api-access-qfxh9\") pod \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.503149 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-inventory\") pod \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " Oct 13 
18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.503234 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjp4f\" (UniqueName: \"kubernetes.io/projected/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-kube-api-access-qjp4f\") pod \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\" (UID: \"af61911c-89bc-4e8c-a327-6c1bab3c7d5d\") " Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.503269 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-secret-volume\") pod \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.503305 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-config-volume\") pod \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\" (UID: \"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd\") " Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.504350 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-config-volume" (OuterVolumeSpecName: "config-volume") pod "bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd" (UID: "bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.508518 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-kube-api-access-qfxh9" (OuterVolumeSpecName: "kube-api-access-qfxh9") pod "bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd" (UID: "bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd"). InnerVolumeSpecName "kube-api-access-qfxh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.509013 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-kube-api-access-qjp4f" (OuterVolumeSpecName: "kube-api-access-qjp4f") pod "af61911c-89bc-4e8c-a327-6c1bab3c7d5d" (UID: "af61911c-89bc-4e8c-a327-6c1bab3c7d5d"). InnerVolumeSpecName "kube-api-access-qjp4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.514439 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd" (UID: "bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.532486 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-inventory" (OuterVolumeSpecName: "inventory") pod "af61911c-89bc-4e8c-a327-6c1bab3c7d5d" (UID: "af61911c-89bc-4e8c-a327-6c1bab3c7d5d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.570617 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "af61911c-89bc-4e8c-a327-6c1bab3c7d5d" (UID: "af61911c-89bc-4e8c-a327-6c1bab3c7d5d"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.606020 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.606061 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfxh9\" (UniqueName: \"kubernetes.io/projected/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-kube-api-access-qfxh9\") on node \"crc\" DevicePath \"\"" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.606078 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.606091 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjp4f\" (UniqueName: \"kubernetes.io/projected/af61911c-89bc-4e8c-a327-6c1bab3c7d5d-kube-api-access-qjp4f\") on node \"crc\" DevicePath \"\"" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.606102 4974 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.606113 4974 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.918970 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.919013 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8" event={"ID":"bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd","Type":"ContainerDied","Data":"5eb44591de55eb540aecdd9b45d1d016fc6f4526ced2b70a1140640db5d7a145"} Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.919064 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5eb44591de55eb540aecdd9b45d1d016fc6f4526ced2b70a1140640db5d7a145" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.921028 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" event={"ID":"af61911c-89bc-4e8c-a327-6c1bab3c7d5d","Type":"ContainerDied","Data":"cb37382a7946acf3cac004fd63df30c43f7a97e5d78b92206567496a10877c99"} Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.921065 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh" Oct 13 18:45:03 crc kubenswrapper[4974]: I1013 18:45:03.921073 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb37382a7946acf3cac004fd63df30c43f7a97e5d78b92206567496a10877c99" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.044230 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v"] Oct 13 18:45:11 crc kubenswrapper[4974]: E1013 18:45:11.045868 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af61911c-89bc-4e8c-a327-6c1bab3c7d5d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.045915 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="af61911c-89bc-4e8c-a327-6c1bab3c7d5d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:45:11 crc kubenswrapper[4974]: E1013 18:45:11.045968 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd" containerName="collect-profiles" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.045986 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd" containerName="collect-profiles" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.046449 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd" containerName="collect-profiles" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.046490 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="af61911c-89bc-4e8c-a327-6c1bab3c7d5d" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.048071 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.053242 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.053569 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.053883 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.054228 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.056012 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v"] Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.177974 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.178306 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wf9p\" (UniqueName: \"kubernetes.io/projected/177f015a-482d-4058-a475-e6f787c7c1e5-kube-api-access-2wf9p\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.178338 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.280645 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.280925 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wf9p\" (UniqueName: \"kubernetes.io/projected/177f015a-482d-4058-a475-e6f787c7c1e5-kube-api-access-2wf9p\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.280990 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.288518 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v\" (UID: 
\"177f015a-482d-4058-a475-e6f787c7c1e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.290401 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.310271 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wf9p\" (UniqueName: \"kubernetes.io/projected/177f015a-482d-4058-a475-e6f787c7c1e5-kube-api-access-2wf9p\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.376482 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:45:11 crc kubenswrapper[4974]: I1013 18:45:11.934906 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v"] Oct 13 18:45:12 crc kubenswrapper[4974]: I1013 18:45:12.015576 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" event={"ID":"177f015a-482d-4058-a475-e6f787c7c1e5","Type":"ContainerStarted","Data":"0a6178f878c1827da714223426944dd1bfa3d85b7787da9422620aa4974aef45"} Oct 13 18:45:13 crc kubenswrapper[4974]: I1013 18:45:13.027994 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" event={"ID":"177f015a-482d-4058-a475-e6f787c7c1e5","Type":"ContainerStarted","Data":"83fc26b6cf0f445a4d95f838b5b31cb55e707fe1e98a51103c2f79489c3edc28"} Oct 13 18:45:13 crc kubenswrapper[4974]: I1013 18:45:13.050359 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" podStartSLOduration=1.482787408 podStartE2EDuration="2.050337811s" podCreationTimestamp="2025-10-13 18:45:11 +0000 UTC" firstStartedPulling="2025-10-13 18:45:11.944634516 +0000 UTC m=+1846.849000596" lastFinishedPulling="2025-10-13 18:45:12.512184919 +0000 UTC m=+1847.416550999" observedRunningTime="2025-10-13 18:45:13.04995844 +0000 UTC m=+1847.954324530" watchObservedRunningTime="2025-10-13 18:45:13.050337811 +0000 UTC m=+1847.954703901" Oct 13 18:45:14 crc kubenswrapper[4974]: I1013 18:45:14.811594 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:45:14 crc kubenswrapper[4974]: E1013 18:45:14.811877 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:45:29 crc kubenswrapper[4974]: I1013 18:45:29.812032 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:45:29 crc kubenswrapper[4974]: E1013 18:45:29.812818 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:45:42 crc kubenswrapper[4974]: I1013 18:45:42.812082 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:45:42 crc kubenswrapper[4974]: E1013 18:45:42.813206 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:45:57 crc kubenswrapper[4974]: I1013 18:45:57.812698 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:45:57 crc kubenswrapper[4974]: E1013 18:45:57.813475 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:46:06 crc kubenswrapper[4974]: I1013 18:46:06.638986 4974 generic.go:334] "Generic (PLEG): container finished" podID="177f015a-482d-4058-a475-e6f787c7c1e5" containerID="83fc26b6cf0f445a4d95f838b5b31cb55e707fe1e98a51103c2f79489c3edc28" exitCode=0 Oct 13 18:46:06 crc kubenswrapper[4974]: I1013 18:46:06.639075 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" event={"ID":"177f015a-482d-4058-a475-e6f787c7c1e5","Type":"ContainerDied","Data":"83fc26b6cf0f445a4d95f838b5b31cb55e707fe1e98a51103c2f79489c3edc28"} Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.155792 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.259187 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-inventory\") pod \"177f015a-482d-4058-a475-e6f787c7c1e5\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.259368 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wf9p\" (UniqueName: \"kubernetes.io/projected/177f015a-482d-4058-a475-e6f787c7c1e5-kube-api-access-2wf9p\") pod \"177f015a-482d-4058-a475-e6f787c7c1e5\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.259433 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-ssh-key\") pod \"177f015a-482d-4058-a475-e6f787c7c1e5\" (UID: \"177f015a-482d-4058-a475-e6f787c7c1e5\") " Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.266029 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/177f015a-482d-4058-a475-e6f787c7c1e5-kube-api-access-2wf9p" (OuterVolumeSpecName: "kube-api-access-2wf9p") pod "177f015a-482d-4058-a475-e6f787c7c1e5" (UID: "177f015a-482d-4058-a475-e6f787c7c1e5"). InnerVolumeSpecName "kube-api-access-2wf9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.292868 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-inventory" (OuterVolumeSpecName: "inventory") pod "177f015a-482d-4058-a475-e6f787c7c1e5" (UID: "177f015a-482d-4058-a475-e6f787c7c1e5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.315272 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "177f015a-482d-4058-a475-e6f787c7c1e5" (UID: "177f015a-482d-4058-a475-e6f787c7c1e5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.362517 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.362642 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wf9p\" (UniqueName: \"kubernetes.io/projected/177f015a-482d-4058-a475-e6f787c7c1e5-kube-api-access-2wf9p\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.362695 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/177f015a-482d-4058-a475-e6f787c7c1e5-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.668523 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" event={"ID":"177f015a-482d-4058-a475-e6f787c7c1e5","Type":"ContainerDied","Data":"0a6178f878c1827da714223426944dd1bfa3d85b7787da9422620aa4974aef45"} Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.668585 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a6178f878c1827da714223426944dd1bfa3d85b7787da9422620aa4974aef45" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.668631 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.796078 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-k7bxc"] Oct 13 18:46:08 crc kubenswrapper[4974]: E1013 18:46:08.796950 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="177f015a-482d-4058-a475-e6f787c7c1e5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.796994 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="177f015a-482d-4058-a475-e6f787c7c1e5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.797466 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="177f015a-482d-4058-a475-e6f787c7c1e5" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.798638 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.801987 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.802276 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.802518 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.808370 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.808870 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-k7bxc"] Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.978947 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-k7bxc\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.979766 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mclh\" (UniqueName: \"kubernetes.io/projected/37c5496c-447a-4806-81b1-15f11c4d057e-kube-api-access-7mclh\") pod \"ssh-known-hosts-edpm-deployment-k7bxc\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:08 crc kubenswrapper[4974]: I1013 18:46:08.979908 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-k7bxc\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:09 crc kubenswrapper[4974]: I1013 18:46:09.082221 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-k7bxc\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:09 crc kubenswrapper[4974]: I1013 18:46:09.082489 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mclh\" (UniqueName: \"kubernetes.io/projected/37c5496c-447a-4806-81b1-15f11c4d057e-kube-api-access-7mclh\") pod \"ssh-known-hosts-edpm-deployment-k7bxc\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:09 crc kubenswrapper[4974]: I1013 18:46:09.082584 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-k7bxc\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:09 crc kubenswrapper[4974]: I1013 18:46:09.090713 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-k7bxc\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:09 crc kubenswrapper[4974]: 
I1013 18:46:09.092193 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-k7bxc\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:09 crc kubenswrapper[4974]: I1013 18:46:09.116922 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mclh\" (UniqueName: \"kubernetes.io/projected/37c5496c-447a-4806-81b1-15f11c4d057e-kube-api-access-7mclh\") pod \"ssh-known-hosts-edpm-deployment-k7bxc\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:09 crc kubenswrapper[4974]: I1013 18:46:09.130598 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:09 crc kubenswrapper[4974]: I1013 18:46:09.757307 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-k7bxc"] Oct 13 18:46:09 crc kubenswrapper[4974]: I1013 18:46:09.812404 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:46:09 crc kubenswrapper[4974]: E1013 18:46:09.812785 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:46:10 crc kubenswrapper[4974]: I1013 18:46:10.696209 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" 
event={"ID":"37c5496c-447a-4806-81b1-15f11c4d057e","Type":"ContainerStarted","Data":"e1387fd35792bfe98191a320b73baa069a3fa17c381ca5229209eb09a03b31d7"} Oct 13 18:46:10 crc kubenswrapper[4974]: I1013 18:46:10.696590 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" event={"ID":"37c5496c-447a-4806-81b1-15f11c4d057e","Type":"ContainerStarted","Data":"2026a0a395d09cb97d214ad1c52fd7ce5212d8dc69281924ea9b5448efb25ffb"} Oct 13 18:46:10 crc kubenswrapper[4974]: I1013 18:46:10.726168 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" podStartSLOduration=2.223006687 podStartE2EDuration="2.726148133s" podCreationTimestamp="2025-10-13 18:46:08 +0000 UTC" firstStartedPulling="2025-10-13 18:46:09.76100845 +0000 UTC m=+1904.665374530" lastFinishedPulling="2025-10-13 18:46:10.264149856 +0000 UTC m=+1905.168515976" observedRunningTime="2025-10-13 18:46:10.716397358 +0000 UTC m=+1905.620763458" watchObservedRunningTime="2025-10-13 18:46:10.726148133 +0000 UTC m=+1905.630514223" Oct 13 18:46:18 crc kubenswrapper[4974]: I1013 18:46:18.808053 4974 generic.go:334] "Generic (PLEG): container finished" podID="37c5496c-447a-4806-81b1-15f11c4d057e" containerID="e1387fd35792bfe98191a320b73baa069a3fa17c381ca5229209eb09a03b31d7" exitCode=0 Oct 13 18:46:18 crc kubenswrapper[4974]: I1013 18:46:18.808148 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" event={"ID":"37c5496c-447a-4806-81b1-15f11c4d057e","Type":"ContainerDied","Data":"e1387fd35792bfe98191a320b73baa069a3fa17c381ca5229209eb09a03b31d7"} Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.285741 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.452303 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-inventory-0\") pod \"37c5496c-447a-4806-81b1-15f11c4d057e\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.452646 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-ssh-key-openstack-edpm-ipam\") pod \"37c5496c-447a-4806-81b1-15f11c4d057e\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.452783 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mclh\" (UniqueName: \"kubernetes.io/projected/37c5496c-447a-4806-81b1-15f11c4d057e-kube-api-access-7mclh\") pod \"37c5496c-447a-4806-81b1-15f11c4d057e\" (UID: \"37c5496c-447a-4806-81b1-15f11c4d057e\") " Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.461310 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37c5496c-447a-4806-81b1-15f11c4d057e-kube-api-access-7mclh" (OuterVolumeSpecName: "kube-api-access-7mclh") pod "37c5496c-447a-4806-81b1-15f11c4d057e" (UID: "37c5496c-447a-4806-81b1-15f11c4d057e"). InnerVolumeSpecName "kube-api-access-7mclh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.486535 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "37c5496c-447a-4806-81b1-15f11c4d057e" (UID: "37c5496c-447a-4806-81b1-15f11c4d057e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.506989 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "37c5496c-447a-4806-81b1-15f11c4d057e" (UID: "37c5496c-447a-4806-81b1-15f11c4d057e"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.555748 4974 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-inventory-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.555790 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37c5496c-447a-4806-81b1-15f11c4d057e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.555805 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mclh\" (UniqueName: \"kubernetes.io/projected/37c5496c-447a-4806-81b1-15f11c4d057e-kube-api-access-7mclh\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.830033 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" 
event={"ID":"37c5496c-447a-4806-81b1-15f11c4d057e","Type":"ContainerDied","Data":"2026a0a395d09cb97d214ad1c52fd7ce5212d8dc69281924ea9b5448efb25ffb"} Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.830419 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2026a0a395d09cb97d214ad1c52fd7ce5212d8dc69281924ea9b5448efb25ffb" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.830119 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-k7bxc" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.959382 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw"] Oct 13 18:46:20 crc kubenswrapper[4974]: E1013 18:46:20.959843 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37c5496c-447a-4806-81b1-15f11c4d057e" containerName="ssh-known-hosts-edpm-deployment" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.959861 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="37c5496c-447a-4806-81b1-15f11c4d057e" containerName="ssh-known-hosts-edpm-deployment" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.960034 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="37c5496c-447a-4806-81b1-15f11c4d057e" containerName="ssh-known-hosts-edpm-deployment" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.960675 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.969145 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.969396 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.969546 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.969706 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:46:20 crc kubenswrapper[4974]: I1013 18:46:20.980433 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw"] Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.065755 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cmccw\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.066085 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzv9t\" (UniqueName: \"kubernetes.io/projected/127007fe-96b5-4741-b207-af9ec05b68da-kube-api-access-vzv9t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cmccw\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.066240 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cmccw\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:21 crc kubenswrapper[4974]: E1013 18:46:21.092188 4974 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37c5496c_447a_4806_81b1_15f11c4d057e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37c5496c_447a_4806_81b1_15f11c4d057e.slice/crio-2026a0a395d09cb97d214ad1c52fd7ce5212d8dc69281924ea9b5448efb25ffb\": RecentStats: unable to find data in memory cache]" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.168718 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzv9t\" (UniqueName: \"kubernetes.io/projected/127007fe-96b5-4741-b207-af9ec05b68da-kube-api-access-vzv9t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cmccw\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.168792 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cmccw\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.168970 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cmccw\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.176787 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cmccw\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.176778 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cmccw\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.191756 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzv9t\" (UniqueName: \"kubernetes.io/projected/127007fe-96b5-4741-b207-af9ec05b68da-kube-api-access-vzv9t\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-cmccw\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.291830 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.674732 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw"] Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.813181 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:46:21 crc kubenswrapper[4974]: E1013 18:46:21.813838 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:46:21 crc kubenswrapper[4974]: I1013 18:46:21.854276 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" event={"ID":"127007fe-96b5-4741-b207-af9ec05b68da","Type":"ContainerStarted","Data":"3d6fa70c10386e07088a5ffd6d9aa9d021caa3b6b09238a7807e1e6b0cddc2b8"} Oct 13 18:46:22 crc kubenswrapper[4974]: I1013 18:46:22.888825 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" event={"ID":"127007fe-96b5-4741-b207-af9ec05b68da","Type":"ContainerStarted","Data":"bffcd64ea9a1adf0f4f3658802e8d424783727623772ea64e2a8fdb2190bf4a2"} Oct 13 18:46:22 crc kubenswrapper[4974]: I1013 18:46:22.926889 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" podStartSLOduration=2.455322639 podStartE2EDuration="2.926859275s" podCreationTimestamp="2025-10-13 18:46:20 +0000 UTC" firstStartedPulling="2025-10-13 18:46:21.686973981 +0000 UTC 
m=+1916.591340071" lastFinishedPulling="2025-10-13 18:46:22.158510627 +0000 UTC m=+1917.062876707" observedRunningTime="2025-10-13 18:46:22.917543983 +0000 UTC m=+1917.821910113" watchObservedRunningTime="2025-10-13 18:46:22.926859275 +0000 UTC m=+1917.831225385" Oct 13 18:46:33 crc kubenswrapper[4974]: I1013 18:46:33.027014 4974 generic.go:334] "Generic (PLEG): container finished" podID="127007fe-96b5-4741-b207-af9ec05b68da" containerID="bffcd64ea9a1adf0f4f3658802e8d424783727623772ea64e2a8fdb2190bf4a2" exitCode=0 Oct 13 18:46:33 crc kubenswrapper[4974]: I1013 18:46:33.027090 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" event={"ID":"127007fe-96b5-4741-b207-af9ec05b68da","Type":"ContainerDied","Data":"bffcd64ea9a1adf0f4f3658802e8d424783727623772ea64e2a8fdb2190bf4a2"} Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.498235 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.604243 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-ssh-key\") pod \"127007fe-96b5-4741-b207-af9ec05b68da\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.604321 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-inventory\") pod \"127007fe-96b5-4741-b207-af9ec05b68da\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.604563 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzv9t\" (UniqueName: 
\"kubernetes.io/projected/127007fe-96b5-4741-b207-af9ec05b68da-kube-api-access-vzv9t\") pod \"127007fe-96b5-4741-b207-af9ec05b68da\" (UID: \"127007fe-96b5-4741-b207-af9ec05b68da\") " Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.610839 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127007fe-96b5-4741-b207-af9ec05b68da-kube-api-access-vzv9t" (OuterVolumeSpecName: "kube-api-access-vzv9t") pod "127007fe-96b5-4741-b207-af9ec05b68da" (UID: "127007fe-96b5-4741-b207-af9ec05b68da"). InnerVolumeSpecName "kube-api-access-vzv9t". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.656509 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "127007fe-96b5-4741-b207-af9ec05b68da" (UID: "127007fe-96b5-4741-b207-af9ec05b68da"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.669150 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-inventory" (OuterVolumeSpecName: "inventory") pod "127007fe-96b5-4741-b207-af9ec05b68da" (UID: "127007fe-96b5-4741-b207-af9ec05b68da"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.707438 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzv9t\" (UniqueName: \"kubernetes.io/projected/127007fe-96b5-4741-b207-af9ec05b68da-kube-api-access-vzv9t\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.707473 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.707485 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/127007fe-96b5-4741-b207-af9ec05b68da-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:34 crc kubenswrapper[4974]: I1013 18:46:34.812070 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:46:34 crc kubenswrapper[4974]: E1013 18:46:34.812458 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.087941 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" event={"ID":"127007fe-96b5-4741-b207-af9ec05b68da","Type":"ContainerDied","Data":"3d6fa70c10386e07088a5ffd6d9aa9d021caa3b6b09238a7807e1e6b0cddc2b8"} Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.087998 4974 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3d6fa70c10386e07088a5ffd6d9aa9d021caa3b6b09238a7807e1e6b0cddc2b8" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.088087 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-cmccw" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.154747 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r"] Oct 13 18:46:35 crc kubenswrapper[4974]: E1013 18:46:35.155961 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127007fe-96b5-4741-b207-af9ec05b68da" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.156062 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="127007fe-96b5-4741-b207-af9ec05b68da" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.156420 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="127007fe-96b5-4741-b207-af9ec05b68da" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.157260 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.166485 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.167014 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.167286 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.167527 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.175115 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r"] Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.220602 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbl4q\" (UniqueName: \"kubernetes.io/projected/fc6631d2-7807-4296-8806-da8155c0992e-kube-api-access-cbl4q\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.220976 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.221109 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.323634 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.323758 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbl4q\" (UniqueName: \"kubernetes.io/projected/fc6631d2-7807-4296-8806-da8155c0992e-kube-api-access-cbl4q\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.324036 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.330261 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.340394 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.350950 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbl4q\" (UniqueName: \"kubernetes.io/projected/fc6631d2-7807-4296-8806-da8155c0992e-kube-api-access-cbl4q\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:35 crc kubenswrapper[4974]: I1013 18:46:35.484848 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:36 crc kubenswrapper[4974]: I1013 18:46:36.091032 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r"] Oct 13 18:46:37 crc kubenswrapper[4974]: I1013 18:46:37.114454 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" event={"ID":"fc6631d2-7807-4296-8806-da8155c0992e","Type":"ContainerStarted","Data":"2fdac880b2663627fa897bec751b3be772372d7e6efd8aa58f00df76a650b2b9"} Oct 13 18:46:37 crc kubenswrapper[4974]: I1013 18:46:37.114944 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" event={"ID":"fc6631d2-7807-4296-8806-da8155c0992e","Type":"ContainerStarted","Data":"559af3509a6fa7a9f816671b168b00bc830d11e69d652690a3d3e288cae511f0"} Oct 13 18:46:37 crc kubenswrapper[4974]: I1013 18:46:37.136728 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" podStartSLOduration=1.533910567 podStartE2EDuration="2.136708121s" podCreationTimestamp="2025-10-13 18:46:35 +0000 UTC" firstStartedPulling="2025-10-13 18:46:36.095493195 +0000 UTC m=+1930.999859285" lastFinishedPulling="2025-10-13 18:46:36.698290729 +0000 UTC m=+1931.602656839" observedRunningTime="2025-10-13 18:46:37.134210111 +0000 UTC m=+1932.038576231" watchObservedRunningTime="2025-10-13 18:46:37.136708121 +0000 UTC m=+1932.041074211" Oct 13 18:46:45 crc kubenswrapper[4974]: I1013 18:46:45.841045 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:46:45 crc kubenswrapper[4974]: E1013 18:46:45.844424 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:46:48 crc kubenswrapper[4974]: I1013 18:46:48.242999 4974 generic.go:334] "Generic (PLEG): container finished" podID="fc6631d2-7807-4296-8806-da8155c0992e" containerID="2fdac880b2663627fa897bec751b3be772372d7e6efd8aa58f00df76a650b2b9" exitCode=0 Oct 13 18:46:48 crc kubenswrapper[4974]: I1013 18:46:48.243109 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" event={"ID":"fc6631d2-7807-4296-8806-da8155c0992e","Type":"ContainerDied","Data":"2fdac880b2663627fa897bec751b3be772372d7e6efd8aa58f00df76a650b2b9"} Oct 13 18:46:49 crc kubenswrapper[4974]: I1013 18:46:49.764899 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:49 crc kubenswrapper[4974]: I1013 18:46:49.848988 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbl4q\" (UniqueName: \"kubernetes.io/projected/fc6631d2-7807-4296-8806-da8155c0992e-kube-api-access-cbl4q\") pod \"fc6631d2-7807-4296-8806-da8155c0992e\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " Oct 13 18:46:49 crc kubenswrapper[4974]: I1013 18:46:49.849061 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-inventory\") pod \"fc6631d2-7807-4296-8806-da8155c0992e\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " Oct 13 18:46:49 crc kubenswrapper[4974]: I1013 18:46:49.849351 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-ssh-key\") pod \"fc6631d2-7807-4296-8806-da8155c0992e\" (UID: \"fc6631d2-7807-4296-8806-da8155c0992e\") " Oct 13 18:46:49 crc kubenswrapper[4974]: I1013 18:46:49.854964 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc6631d2-7807-4296-8806-da8155c0992e-kube-api-access-cbl4q" (OuterVolumeSpecName: "kube-api-access-cbl4q") pod "fc6631d2-7807-4296-8806-da8155c0992e" (UID: "fc6631d2-7807-4296-8806-da8155c0992e"). InnerVolumeSpecName "kube-api-access-cbl4q". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:46:49 crc kubenswrapper[4974]: I1013 18:46:49.876637 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-inventory" (OuterVolumeSpecName: "inventory") pod "fc6631d2-7807-4296-8806-da8155c0992e" (UID: "fc6631d2-7807-4296-8806-da8155c0992e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:46:49 crc kubenswrapper[4974]: I1013 18:46:49.879125 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fc6631d2-7807-4296-8806-da8155c0992e" (UID: "fc6631d2-7807-4296-8806-da8155c0992e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:46:49 crc kubenswrapper[4974]: I1013 18:46:49.951536 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:49 crc kubenswrapper[4974]: I1013 18:46:49.951573 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbl4q\" (UniqueName: \"kubernetes.io/projected/fc6631d2-7807-4296-8806-da8155c0992e-kube-api-access-cbl4q\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:49 crc kubenswrapper[4974]: I1013 18:46:49.951586 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fc6631d2-7807-4296-8806-da8155c0992e-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.266193 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" event={"ID":"fc6631d2-7807-4296-8806-da8155c0992e","Type":"ContainerDied","Data":"559af3509a6fa7a9f816671b168b00bc830d11e69d652690a3d3e288cae511f0"} Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.266243 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="559af3509a6fa7a9f816671b168b00bc830d11e69d652690a3d3e288cae511f0" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.266318 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.399318 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86"] Oct 13 18:46:50 crc kubenswrapper[4974]: E1013 18:46:50.400219 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc6631d2-7807-4296-8806-da8155c0992e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.400238 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc6631d2-7807-4296-8806-da8155c0992e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.400427 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc6631d2-7807-4296-8806-da8155c0992e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.401159 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.402879 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.403163 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.403321 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.403503 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.403590 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.403638 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.406513 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.406816 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.415575 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86"] Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.459854 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.459899 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.459923 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.459955 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.460002 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.460097 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.460136 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.460175 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.460225 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ovn-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.460242 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.460284 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.460304 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.460346 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czf5w\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-kube-api-access-czf5w\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.460382 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562092 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562131 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562154 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562174 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562190 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czf5w\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-kube-api-access-czf5w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562224 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562275 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562300 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562320 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562347 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562370 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562407 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562422 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.562455 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.568677 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.569113 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" 
(UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.570014 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.570013 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.571009 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.571238 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.571256 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.572007 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.572015 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.572419 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: 
\"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.573425 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.584197 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czf5w\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-kube-api-access-czf5w\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.588900 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: I1013 18:46:50.593193 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-wpc86\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:50 crc kubenswrapper[4974]: 
I1013 18:46:50.719851 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:46:51 crc kubenswrapper[4974]: I1013 18:46:51.302308 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86"] Oct 13 18:46:52 crc kubenswrapper[4974]: I1013 18:46:52.286537 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" event={"ID":"d272e4d0-84bf-4909-af41-81fe1f14bfcb","Type":"ContainerStarted","Data":"65b1561606ac7f799c23e638cde6268a4f3e2f89014039568918d1851c9d43ae"} Oct 13 18:46:52 crc kubenswrapper[4974]: I1013 18:46:52.287121 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" event={"ID":"d272e4d0-84bf-4909-af41-81fe1f14bfcb","Type":"ContainerStarted","Data":"69a43957b1619ee7b847207a21a5072b7e820a0d3d5a758a0854287eee64c917"} Oct 13 18:46:52 crc kubenswrapper[4974]: I1013 18:46:52.313347 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" podStartSLOduration=1.88968868 podStartE2EDuration="2.313317895s" podCreationTimestamp="2025-10-13 18:46:50 +0000 UTC" firstStartedPulling="2025-10-13 18:46:51.304209114 +0000 UTC m=+1946.208575204" lastFinishedPulling="2025-10-13 18:46:51.727838339 +0000 UTC m=+1946.632204419" observedRunningTime="2025-10-13 18:46:52.305959277 +0000 UTC m=+1947.210325357" watchObservedRunningTime="2025-10-13 18:46:52.313317895 +0000 UTC m=+1947.217683975" Oct 13 18:46:59 crc kubenswrapper[4974]: I1013 18:46:59.811450 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:46:59 crc kubenswrapper[4974]: E1013 18:46:59.812206 4974 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:47:13 crc kubenswrapper[4974]: I1013 18:47:13.812607 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:47:13 crc kubenswrapper[4974]: E1013 18:47:13.813781 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:47:26 crc kubenswrapper[4974]: I1013 18:47:26.812452 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:47:26 crc kubenswrapper[4974]: E1013 18:47:26.813220 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:47:38 crc kubenswrapper[4974]: I1013 18:47:38.801316 4974 generic.go:334] "Generic (PLEG): container finished" podID="d272e4d0-84bf-4909-af41-81fe1f14bfcb" containerID="65b1561606ac7f799c23e638cde6268a4f3e2f89014039568918d1851c9d43ae" exitCode=0 Oct 13 18:47:38 crc 
kubenswrapper[4974]: I1013 18:47:38.801424 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" event={"ID":"d272e4d0-84bf-4909-af41-81fe1f14bfcb","Type":"ContainerDied","Data":"65b1561606ac7f799c23e638cde6268a4f3e2f89014039568918d1851c9d43ae"} Oct 13 18:47:38 crc kubenswrapper[4974]: I1013 18:47:38.812577 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:47:39 crc kubenswrapper[4974]: I1013 18:47:39.825529 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"7a17bc64a82f6cca5f6f48f02d11fd723d2db77ce76e4c7c6f89c54ff5a7525c"} Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.301502 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.407829 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.407905 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-bootstrap-combined-ca-bundle\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.407954 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.407981 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ssh-key\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.408206 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-neutron-metadata-combined-ca-bundle\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.408296 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ovn-combined-ca-bundle\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.408332 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-nova-combined-ca-bundle\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.408395 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-libvirt-combined-ca-bundle\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.408421 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-ovn-default-certs-0\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.408455 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.408484 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czf5w\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-kube-api-access-czf5w\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.408518 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-repo-setup-combined-ca-bundle\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.408541 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-inventory\") pod 
\"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.408603 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-telemetry-combined-ca-bundle\") pod \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\" (UID: \"d272e4d0-84bf-4909-af41-81fe1f14bfcb\") " Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.414151 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.415337 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.417801 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.418316 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.418339 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.418397 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.418636 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-kube-api-access-czf5w" (OuterVolumeSpecName: "kube-api-access-czf5w") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "kube-api-access-czf5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.419321 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.421352 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.422475 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.425774 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.430877 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.444039 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-inventory" (OuterVolumeSpecName: "inventory") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.447962 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d272e4d0-84bf-4909-af41-81fe1f14bfcb" (UID: "d272e4d0-84bf-4909-af41-81fe1f14bfcb"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512050 4974 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512102 4974 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512124 4974 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512146 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512165 4974 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512183 4974 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512201 4974 reconciler_common.go:293] "Volume detached for volume 
\"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512218 4974 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512236 4974 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512257 4974 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512276 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czf5w\" (UniqueName: \"kubernetes.io/projected/d272e4d0-84bf-4909-af41-81fe1f14bfcb-kube-api-access-czf5w\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512294 4974 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512315 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.512336 4974 
reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d272e4d0-84bf-4909-af41-81fe1f14bfcb-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.832389 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" event={"ID":"d272e4d0-84bf-4909-af41-81fe1f14bfcb","Type":"ContainerDied","Data":"69a43957b1619ee7b847207a21a5072b7e820a0d3d5a758a0854287eee64c917"} Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.832787 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69a43957b1619ee7b847207a21a5072b7e820a0d3d5a758a0854287eee64c917" Oct 13 18:47:40 crc kubenswrapper[4974]: I1013 18:47:40.832515 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-wpc86" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.078509 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh"] Oct 13 18:47:41 crc kubenswrapper[4974]: E1013 18:47:41.079065 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d272e4d0-84bf-4909-af41-81fe1f14bfcb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.079093 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d272e4d0-84bf-4909-af41-81fe1f14bfcb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.081857 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d272e4d0-84bf-4909-af41-81fe1f14bfcb" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.086806 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.093775 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.095257 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.096952 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.097551 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.098082 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.105744 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh"] Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.234226 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.234331 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.234384 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ks7v\" (UniqueName: \"kubernetes.io/projected/a0a9ad72-5d41-4b79-8d65-797ed063b530-kube-api-access-2ks7v\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.234634 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.234992 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.337134 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.337238 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.337343 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.337410 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ks7v\" (UniqueName: \"kubernetes.io/projected/a0a9ad72-5d41-4b79-8d65-797ed063b530-kube-api-access-2ks7v\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.337499 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.338850 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc 
kubenswrapper[4974]: I1013 18:47:41.345328 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.345366 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.352540 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.369477 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ks7v\" (UniqueName: \"kubernetes.io/projected/a0a9ad72-5d41-4b79-8d65-797ed063b530-kube-api-access-2ks7v\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-5wshh\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:41 crc kubenswrapper[4974]: I1013 18:47:41.433718 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:47:42 crc kubenswrapper[4974]: I1013 18:47:42.008508 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh"] Oct 13 18:47:42 crc kubenswrapper[4974]: I1013 18:47:42.889904 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" event={"ID":"a0a9ad72-5d41-4b79-8d65-797ed063b530","Type":"ContainerStarted","Data":"b2cdafc54991a07d7358c241e7ac684922cfe92055fe62be85587c4fde3ff961"} Oct 13 18:47:43 crc kubenswrapper[4974]: I1013 18:47:43.907100 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" event={"ID":"a0a9ad72-5d41-4b79-8d65-797ed063b530","Type":"ContainerStarted","Data":"c42867b8b699681f6437c408b1700d1cb93488e58ee308a43db2f6032c4ab9f1"} Oct 13 18:47:43 crc kubenswrapper[4974]: I1013 18:47:43.934458 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" podStartSLOduration=2.311640373 podStartE2EDuration="2.93444106s" podCreationTimestamp="2025-10-13 18:47:41 +0000 UTC" firstStartedPulling="2025-10-13 18:47:42.017417488 +0000 UTC m=+1996.921783568" lastFinishedPulling="2025-10-13 18:47:42.640218145 +0000 UTC m=+1997.544584255" observedRunningTime="2025-10-13 18:47:43.933554865 +0000 UTC m=+1998.837920985" watchObservedRunningTime="2025-10-13 18:47:43.93444106 +0000 UTC m=+1998.838807140" Oct 13 18:48:58 crc kubenswrapper[4974]: I1013 18:48:58.778138 4974 generic.go:334] "Generic (PLEG): container finished" podID="a0a9ad72-5d41-4b79-8d65-797ed063b530" containerID="c42867b8b699681f6437c408b1700d1cb93488e58ee308a43db2f6032c4ab9f1" exitCode=0 Oct 13 18:48:58 crc kubenswrapper[4974]: I1013 18:48:58.778232 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" event={"ID":"a0a9ad72-5d41-4b79-8d65-797ed063b530","Type":"ContainerDied","Data":"c42867b8b699681f6437c408b1700d1cb93488e58ee308a43db2f6032c4ab9f1"} Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.241879 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.244031 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-inventory\") pod \"a0a9ad72-5d41-4b79-8d65-797ed063b530\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.244080 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ssh-key\") pod \"a0a9ad72-5d41-4b79-8d65-797ed063b530\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.244102 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovn-combined-ca-bundle\") pod \"a0a9ad72-5d41-4b79-8d65-797ed063b530\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.244129 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovncontroller-config-0\") pod \"a0a9ad72-5d41-4b79-8d65-797ed063b530\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.244158 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2ks7v\" (UniqueName: \"kubernetes.io/projected/a0a9ad72-5d41-4b79-8d65-797ed063b530-kube-api-access-2ks7v\") pod \"a0a9ad72-5d41-4b79-8d65-797ed063b530\" (UID: \"a0a9ad72-5d41-4b79-8d65-797ed063b530\") " Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.251346 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a0a9ad72-5d41-4b79-8d65-797ed063b530" (UID: "a0a9ad72-5d41-4b79-8d65-797ed063b530"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.251902 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a9ad72-5d41-4b79-8d65-797ed063b530-kube-api-access-2ks7v" (OuterVolumeSpecName: "kube-api-access-2ks7v") pod "a0a9ad72-5d41-4b79-8d65-797ed063b530" (UID: "a0a9ad72-5d41-4b79-8d65-797ed063b530"). InnerVolumeSpecName "kube-api-access-2ks7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.287699 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a0a9ad72-5d41-4b79-8d65-797ed063b530" (UID: "a0a9ad72-5d41-4b79-8d65-797ed063b530"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.298229 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-inventory" (OuterVolumeSpecName: "inventory") pod "a0a9ad72-5d41-4b79-8d65-797ed063b530" (UID: "a0a9ad72-5d41-4b79-8d65-797ed063b530"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.306235 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a0a9ad72-5d41-4b79-8d65-797ed063b530" (UID: "a0a9ad72-5d41-4b79-8d65-797ed063b530"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.346874 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-inventory\") on node \"crc\" DevicePath \"\""
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.346913 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ssh-key\") on node \"crc\" DevicePath \"\""
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.346928 4974 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.346945 4974 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0a9ad72-5d41-4b79-8d65-797ed063b530-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.346957 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ks7v\" (UniqueName: \"kubernetes.io/projected/a0a9ad72-5d41-4b79-8d65-797ed063b530-kube-api-access-2ks7v\") on node \"crc\" DevicePath \"\""
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.806470 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh" event={"ID":"a0a9ad72-5d41-4b79-8d65-797ed063b530","Type":"ContainerDied","Data":"b2cdafc54991a07d7358c241e7ac684922cfe92055fe62be85587c4fde3ff961"}
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.806544 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2cdafc54991a07d7358c241e7ac684922cfe92055fe62be85587c4fde3ff961"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.806570 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-5wshh"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.919973 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"]
Oct 13 18:49:00 crc kubenswrapper[4974]: E1013 18:49:00.921032 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a9ad72-5d41-4b79-8d65-797ed063b530" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.921055 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a9ad72-5d41-4b79-8d65-797ed063b530" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.921670 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a9ad72-5d41-4b79-8d65-797ed063b530" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.923135 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.928872 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.929014 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.929395 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.929482 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.929815 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.929977 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.958192 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"]
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.960082 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.960130 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-22xdr\" (UniqueName: \"kubernetes.io/projected/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-kube-api-access-22xdr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.960215 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.960246 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.960274 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:00 crc kubenswrapper[4974]: I1013 18:49:00.960336 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.062819 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.063187 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.063289 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-22xdr\" (UniqueName: \"kubernetes.io/projected/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-kube-api-access-22xdr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.063427 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.063519 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.063607 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.072911 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.072943 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.073155 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.076268 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.078487 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.079449 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-22xdr\" (UniqueName: \"kubernetes.io/projected/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-kube-api-access-22xdr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.258319 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.604036 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72"]
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.613121 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 13 18:49:01 crc kubenswrapper[4974]: I1013 18:49:01.824998 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72" event={"ID":"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee","Type":"ContainerStarted","Data":"5fc81581df35bb8d031c8a552c57f4dd03e1045ce671440e6d1f5c5542db0056"}
Oct 13 18:49:02 crc kubenswrapper[4974]: I1013 18:49:02.841930 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72" event={"ID":"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee","Type":"ContainerStarted","Data":"9a333992526baa9930c427ef7af4c25bb7893f345c5e2771be7470662a4d37ec"}
Oct 13 18:49:02 crc kubenswrapper[4974]: I1013 18:49:02.870678 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72" podStartSLOduration=2.140799698 podStartE2EDuration="2.870625378s" podCreationTimestamp="2025-10-13 18:49:00 +0000 UTC" firstStartedPulling="2025-10-13 18:49:01.612566932 +0000 UTC m=+2076.516933042" lastFinishedPulling="2025-10-13 18:49:02.342392632 +0000 UTC m=+2077.246758722" observedRunningTime="2025-10-13 18:49:02.862049147 +0000 UTC m=+2077.766415267" watchObservedRunningTime="2025-10-13 18:49:02.870625378 +0000 UTC m=+2077.774991498"
Oct 13 18:49:08 crc kubenswrapper[4974]: I1013 18:49:08.937649 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4s46v"]
Oct 13 18:49:08 crc kubenswrapper[4974]: I1013 18:49:08.946014 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:08 crc kubenswrapper[4974]: I1013 18:49:08.948590 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4s46v"]
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.068850 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-catalog-content\") pod \"certified-operators-4s46v\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") " pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.068918 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-utilities\") pod \"certified-operators-4s46v\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") " pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.069033 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm64l\" (UniqueName: \"kubernetes.io/projected/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-kube-api-access-jm64l\") pod \"certified-operators-4s46v\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") " pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.170970 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-catalog-content\") pod \"certified-operators-4s46v\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") " pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.171020 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-utilities\") pod \"certified-operators-4s46v\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") " pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.171091 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jm64l\" (UniqueName: \"kubernetes.io/projected/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-kube-api-access-jm64l\") pod \"certified-operators-4s46v\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") " pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.171596 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-catalog-content\") pod \"certified-operators-4s46v\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") " pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.171764 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-utilities\") pod \"certified-operators-4s46v\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") " pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.193322 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm64l\" (UniqueName: \"kubernetes.io/projected/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-kube-api-access-jm64l\") pod \"certified-operators-4s46v\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") " pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.278503 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.600712 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4s46v"]
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.914061 4974 generic.go:334] "Generic (PLEG): container finished" podID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerID="26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78" exitCode=0
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.914103 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s46v" event={"ID":"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8","Type":"ContainerDied","Data":"26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78"}
Oct 13 18:49:09 crc kubenswrapper[4974]: I1013 18:49:09.914150 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s46v" event={"ID":"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8","Type":"ContainerStarted","Data":"5dddb9a2cda227ed0325acb619e1305c33c1eab47eaa7bff49a01b0d7eec9df1"}
Oct 13 18:49:10 crc kubenswrapper[4974]: I1013 18:49:10.934625 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s46v" event={"ID":"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8","Type":"ContainerStarted","Data":"1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b"}
Oct 13 18:49:11 crc kubenswrapper[4974]: I1013 18:49:11.947023 4974 generic.go:334] "Generic (PLEG): container finished" podID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerID="1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b" exitCode=0
Oct 13 18:49:11 crc kubenswrapper[4974]: I1013 18:49:11.947084 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s46v" event={"ID":"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8","Type":"ContainerDied","Data":"1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b"}
Oct 13 18:49:12 crc kubenswrapper[4974]: I1013 18:49:12.958072 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s46v" event={"ID":"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8","Type":"ContainerStarted","Data":"1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95"}
Oct 13 18:49:12 crc kubenswrapper[4974]: I1013 18:49:12.994726 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4s46v" podStartSLOduration=2.401513316 podStartE2EDuration="4.994707617s" podCreationTimestamp="2025-10-13 18:49:08 +0000 UTC" firstStartedPulling="2025-10-13 18:49:09.916612069 +0000 UTC m=+2084.820978189" lastFinishedPulling="2025-10-13 18:49:12.5098064 +0000 UTC m=+2087.414172490" observedRunningTime="2025-10-13 18:49:12.97882065 +0000 UTC m=+2087.883186760" watchObservedRunningTime="2025-10-13 18:49:12.994707617 +0000 UTC m=+2087.899073707"
Oct 13 18:49:19 crc kubenswrapper[4974]: I1013 18:49:19.279517 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:19 crc kubenswrapper[4974]: I1013 18:49:19.280277 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:19 crc kubenswrapper[4974]: I1013 18:49:19.371068 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:20 crc kubenswrapper[4974]: I1013 18:49:20.099377 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:20 crc kubenswrapper[4974]: I1013 18:49:20.158080 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4s46v"]
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.048768 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4s46v" podUID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerName="registry-server" containerID="cri-o://1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95" gracePeriod=2
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.541785 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.680892 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-utilities\") pod \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") "
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.681080 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-catalog-content\") pod \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") "
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.681114 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm64l\" (UniqueName: \"kubernetes.io/projected/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-kube-api-access-jm64l\") pod \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\" (UID: \"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8\") "
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.682115 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-utilities" (OuterVolumeSpecName: "utilities") pod "1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" (UID: "1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.688868 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-kube-api-access-jm64l" (OuterVolumeSpecName: "kube-api-access-jm64l") pod "1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" (UID: "1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8"). InnerVolumeSpecName "kube-api-access-jm64l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.727669 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" (UID: "1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.783823 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-utilities\") on node \"crc\" DevicePath \"\""
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.784704 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 13 18:49:22 crc kubenswrapper[4974]: I1013 18:49:22.784734 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jm64l\" (UniqueName: \"kubernetes.io/projected/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8-kube-api-access-jm64l\") on node \"crc\" DevicePath \"\""
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.064160 4974 generic.go:334] "Generic (PLEG): container finished" podID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerID="1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95" exitCode=0
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.064210 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s46v" event={"ID":"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8","Type":"ContainerDied","Data":"1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95"}
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.064246 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4s46v" event={"ID":"1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8","Type":"ContainerDied","Data":"5dddb9a2cda227ed0325acb619e1305c33c1eab47eaa7bff49a01b0d7eec9df1"}
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.064265 4974 scope.go:117] "RemoveContainer" containerID="1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95"
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.064271 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4s46v"
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.107731 4974 scope.go:117] "RemoveContainer" containerID="1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b"
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.110260 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4s46v"]
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.119614 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4s46v"]
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.178442 4974 scope.go:117] "RemoveContainer" containerID="26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78"
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.248793 4974 scope.go:117] "RemoveContainer" containerID="1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95"
Oct 13 18:49:23 crc kubenswrapper[4974]: E1013 18:49:23.249438 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95\": container with ID starting with 1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95 not found: ID does not exist" containerID="1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95"
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.249468 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95"} err="failed to get container status \"1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95\": rpc error: code = NotFound desc = could not find container \"1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95\": container with ID starting with 1ad2fa4e2d89c33d5de9037659c5f13f042636db9c7b1cd75ea61794bc9b3c95 not found: ID does not exist"
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.249503 4974 scope.go:117] "RemoveContainer" containerID="1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b"
Oct 13 18:49:23 crc kubenswrapper[4974]: E1013 18:49:23.249836 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b\": container with ID starting with 1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b not found: ID does not exist" containerID="1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b"
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.249872 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b"} err="failed to get container status \"1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b\": rpc error: code = NotFound desc = could not find container \"1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b\": container with ID starting with 1ab614f52de53d791fefd61fce096f90f99477ebf415ac2d058e6aa1da7d2f1b not found: ID does not exist"
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.249898 4974 scope.go:117] "RemoveContainer" containerID="26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78"
Oct 13 18:49:23 crc kubenswrapper[4974]: E1013 18:49:23.250344 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78\": container with ID starting with 26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78 not found: ID does not exist" containerID="26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78"
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.250367 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78"} err="failed to get container status \"26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78\": rpc error: code = NotFound desc = could not find container \"26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78\": container with ID starting with 26a44584e9c78538ab144e8c5e7e50a56d49774c29f1315d6dbb81eb01570f78 not found: ID does not exist"
Oct 13 18:49:23 crc kubenswrapper[4974]: I1013 18:49:23.821565 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" path="/var/lib/kubelet/pods/1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8/volumes"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.238608 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mzhzb"]
Oct 13 18:50:00 crc kubenswrapper[4974]: E1013 18:50:00.239509 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerName="extract-content"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.239523 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerName="extract-content"
Oct 13 18:50:00 crc kubenswrapper[4974]: E1013 18:50:00.239542 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerName="registry-server"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.239547 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerName="registry-server"
Oct 13 18:50:00 crc kubenswrapper[4974]: E1013 18:50:00.239565 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerName="extract-utilities"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.239571 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerName="extract-utilities"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.239813 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fd612f1-2a3f-4dd9-ad88-55bcf4eacdb8" containerName="registry-server"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.241637 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.257789 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mzhzb"]
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.404791 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54hvw\" (UniqueName: \"kubernetes.io/projected/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-kube-api-access-54hvw\") pod \"redhat-operators-mzhzb\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.404880 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-utilities\") pod \"redhat-operators-mzhzb\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.404993 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-catalog-content\") pod \"redhat-operators-mzhzb\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.506119 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54hvw\" (UniqueName: \"kubernetes.io/projected/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-kube-api-access-54hvw\") pod \"redhat-operators-mzhzb\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.506244 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-utilities\") pod \"redhat-operators-mzhzb\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.506395 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-catalog-content\") pod \"redhat-operators-mzhzb\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.506815 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-catalog-content\") pod \"redhat-operators-mzhzb\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.506821 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-utilities\") pod \"redhat-operators-mzhzb\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.526834 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54hvw\" (UniqueName: \"kubernetes.io/projected/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-kube-api-access-54hvw\") pod \"redhat-operators-mzhzb\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:00 crc kubenswrapper[4974]: I1013 18:50:00.564727 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mzhzb"
Oct 13 18:50:01 crc kubenswrapper[4974]: I1013 18:50:01.079165 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mzhzb"]
Oct 13 18:50:01 crc kubenswrapper[4974]: I1013 18:50:01.512871 4974 generic.go:334] "Generic (PLEG): container finished" podID="4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee" containerID="9a333992526baa9930c427ef7af4c25bb7893f345c5e2771be7470662a4d37ec" exitCode=0
Oct 13 18:50:01 crc kubenswrapper[4974]: I1013 18:50:01.512944 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72" event={"ID":"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee","Type":"ContainerDied","Data":"9a333992526baa9930c427ef7af4c25bb7893f345c5e2771be7470662a4d37ec"}
Oct 13 18:50:01 crc kubenswrapper[4974]: I1013 18:50:01.520113 4974 generic.go:334] "Generic (PLEG): container finished" podID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerID="feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00" exitCode=0
Oct 13 18:50:01 crc kubenswrapper[4974]: I1013 18:50:01.520157 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzhzb" event={"ID":"5c5dfa6b-69fc-4283-9f3a-f16d34088eae","Type":"ContainerDied","Data":"feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00"}
Oct 13 18:50:01 crc kubenswrapper[4974]: I1013 18:50:01.520202 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openshift-marketplace/redhat-operators-mzhzb" event={"ID":"5c5dfa6b-69fc-4283-9f3a-f16d34088eae","Type":"ContainerStarted","Data":"9892e65d672ce017d07f180aea7c753b348c330d765432a6d2ff84cdd248a36a"} Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.012092 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.199309 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-ovn-metadata-agent-neutron-config-0\") pod \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.199389 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-metadata-combined-ca-bundle\") pod \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.199438 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-ssh-key\") pod \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.199460 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-22xdr\" (UniqueName: \"kubernetes.io/projected/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-kube-api-access-22xdr\") pod \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.199508 4974 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-nova-metadata-neutron-config-0\") pod \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.199563 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-inventory\") pod \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\" (UID: \"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee\") " Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.208611 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee" (UID: "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.209169 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-kube-api-access-22xdr" (OuterVolumeSpecName: "kube-api-access-22xdr") pod "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee" (UID: "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee"). InnerVolumeSpecName "kube-api-access-22xdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.245791 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee" (UID: "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.249707 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-inventory" (OuterVolumeSpecName: "inventory") pod "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee" (UID: "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.252966 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee" (UID: "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.254010 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee" (UID: "4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.301879 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.301907 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-22xdr\" (UniqueName: \"kubernetes.io/projected/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-kube-api-access-22xdr\") on node \"crc\" DevicePath \"\"" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.301933 4974 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.301945 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.301959 4974 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.301971 4974 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.545975 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72" 
event={"ID":"4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee","Type":"ContainerDied","Data":"5fc81581df35bb8d031c8a552c57f4dd03e1045ce671440e6d1f5c5542db0056"} Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.546246 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fc81581df35bb8d031c8a552c57f4dd03e1045ce671440e6d1f5c5542db0056" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.546076 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.752644 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt"] Oct 13 18:50:03 crc kubenswrapper[4974]: E1013 18:50:03.753833 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.753859 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.754109 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.754986 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.757752 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.757867 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.758030 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.758376 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.759801 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.796340 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt"] Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.809989 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.810053 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.810138 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.810173 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2xdw\" (UniqueName: \"kubernetes.io/projected/dc0e7077-837e-4e51-a095-60eed2b94a51-kube-api-access-s2xdw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.810218 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.911954 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.912006 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.912075 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.912107 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2xdw\" (UniqueName: \"kubernetes.io/projected/dc0e7077-837e-4e51-a095-60eed2b94a51-kube-api-access-s2xdw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.912153 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.917127 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.926392 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.926738 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.927218 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:03 crc kubenswrapper[4974]: I1013 18:50:03.941771 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2xdw\" (UniqueName: \"kubernetes.io/projected/dc0e7077-837e-4e51-a095-60eed2b94a51-kube-api-access-s2xdw\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-bpglt\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:04 crc kubenswrapper[4974]: I1013 18:50:04.073464 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:50:04 crc kubenswrapper[4974]: I1013 18:50:04.561686 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzhzb" event={"ID":"5c5dfa6b-69fc-4283-9f3a-f16d34088eae","Type":"ContainerStarted","Data":"f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5"} Oct 13 18:50:04 crc kubenswrapper[4974]: I1013 18:50:04.688159 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt"] Oct 13 18:50:05 crc kubenswrapper[4974]: I1013 18:50:05.586279 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" event={"ID":"dc0e7077-837e-4e51-a095-60eed2b94a51","Type":"ContainerStarted","Data":"09c538efe6a6cce0a400878f29c718bbcb7380f6937dad388ff3a882f3c26738"} Oct 13 18:50:06 crc kubenswrapper[4974]: I1013 18:50:06.613261 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" event={"ID":"dc0e7077-837e-4e51-a095-60eed2b94a51","Type":"ContainerStarted","Data":"e468d45b9e6d20b72aa6a98fc8ba2a2798b99b8edc1887c285c629253537501e"} Oct 13 18:50:06 crc kubenswrapper[4974]: I1013 18:50:06.642721 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" podStartSLOduration=2.851436171 podStartE2EDuration="3.642704014s" podCreationTimestamp="2025-10-13 18:50:03 +0000 UTC" firstStartedPulling="2025-10-13 18:50:04.701087633 +0000 UTC m=+2139.605453703" lastFinishedPulling="2025-10-13 18:50:05.492355456 +0000 UTC m=+2140.396721546" observedRunningTime="2025-10-13 18:50:06.633119605 +0000 UTC m=+2141.537485725" watchObservedRunningTime="2025-10-13 18:50:06.642704014 +0000 UTC m=+2141.547070094" Oct 13 18:50:07 crc kubenswrapper[4974]: I1013 18:50:07.628456 4974 generic.go:334] 
"Generic (PLEG): container finished" podID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerID="f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5" exitCode=0 Oct 13 18:50:07 crc kubenswrapper[4974]: I1013 18:50:07.628542 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzhzb" event={"ID":"5c5dfa6b-69fc-4283-9f3a-f16d34088eae","Type":"ContainerDied","Data":"f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5"} Oct 13 18:50:07 crc kubenswrapper[4974]: I1013 18:50:07.742787 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:50:07 crc kubenswrapper[4974]: I1013 18:50:07.742863 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:50:08 crc kubenswrapper[4974]: I1013 18:50:08.657546 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzhzb" event={"ID":"5c5dfa6b-69fc-4283-9f3a-f16d34088eae","Type":"ContainerStarted","Data":"5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b"} Oct 13 18:50:08 crc kubenswrapper[4974]: I1013 18:50:08.687975 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mzhzb" podStartSLOduration=2.010679831 podStartE2EDuration="8.687951518s" podCreationTimestamp="2025-10-13 18:50:00 +0000 UTC" firstStartedPulling="2025-10-13 18:50:01.522953821 +0000 UTC m=+2136.427319901" lastFinishedPulling="2025-10-13 18:50:08.200225468 
+0000 UTC m=+2143.104591588" observedRunningTime="2025-10-13 18:50:08.686296671 +0000 UTC m=+2143.590662751" watchObservedRunningTime="2025-10-13 18:50:08.687951518 +0000 UTC m=+2143.592317618" Oct 13 18:50:10 crc kubenswrapper[4974]: I1013 18:50:10.565934 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mzhzb" Oct 13 18:50:10 crc kubenswrapper[4974]: I1013 18:50:10.566427 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mzhzb" Oct 13 18:50:11 crc kubenswrapper[4974]: I1013 18:50:11.653582 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mzhzb" podUID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerName="registry-server" probeResult="failure" output=< Oct 13 18:50:11 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 18:50:11 crc kubenswrapper[4974]: > Oct 13 18:50:20 crc kubenswrapper[4974]: I1013 18:50:20.653621 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mzhzb" Oct 13 18:50:20 crc kubenswrapper[4974]: I1013 18:50:20.736295 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mzhzb" Oct 13 18:50:20 crc kubenswrapper[4974]: I1013 18:50:20.907333 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mzhzb"] Oct 13 18:50:21 crc kubenswrapper[4974]: I1013 18:50:21.797002 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mzhzb" podUID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerName="registry-server" containerID="cri-o://5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b" gracePeriod=2 Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.287359 4974 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mzhzb" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.468289 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54hvw\" (UniqueName: \"kubernetes.io/projected/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-kube-api-access-54hvw\") pod \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.468382 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-utilities\") pod \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.468436 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-catalog-content\") pod \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\" (UID: \"5c5dfa6b-69fc-4283-9f3a-f16d34088eae\") " Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.469816 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-utilities" (OuterVolumeSpecName: "utilities") pod "5c5dfa6b-69fc-4283-9f3a-f16d34088eae" (UID: "5c5dfa6b-69fc-4283-9f3a-f16d34088eae"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.475970 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-kube-api-access-54hvw" (OuterVolumeSpecName: "kube-api-access-54hvw") pod "5c5dfa6b-69fc-4283-9f3a-f16d34088eae" (UID: "5c5dfa6b-69fc-4283-9f3a-f16d34088eae"). InnerVolumeSpecName "kube-api-access-54hvw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.571458 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54hvw\" (UniqueName: \"kubernetes.io/projected/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-kube-api-access-54hvw\") on node \"crc\" DevicePath \"\"" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.571508 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.582922 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5c5dfa6b-69fc-4283-9f3a-f16d34088eae" (UID: "5c5dfa6b-69fc-4283-9f3a-f16d34088eae"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.672818 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c5dfa6b-69fc-4283-9f3a-f16d34088eae-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.832939 4974 generic.go:334] "Generic (PLEG): container finished" podID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerID="5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b" exitCode=0 Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.833019 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mzhzb" event={"ID":"5c5dfa6b-69fc-4283-9f3a-f16d34088eae","Type":"ContainerDied","Data":"5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b"} Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.833064 4974 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-mzhzb" event={"ID":"5c5dfa6b-69fc-4283-9f3a-f16d34088eae","Type":"ContainerDied","Data":"9892e65d672ce017d07f180aea7c753b348c330d765432a6d2ff84cdd248a36a"} Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.833121 4974 scope.go:117] "RemoveContainer" containerID="5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.833153 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mzhzb" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.868145 4974 scope.go:117] "RemoveContainer" containerID="f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.890321 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mzhzb"] Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.915627 4974 scope.go:117] "RemoveContainer" containerID="feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.917458 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mzhzb"] Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.957461 4974 scope.go:117] "RemoveContainer" containerID="5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b" Oct 13 18:50:22 crc kubenswrapper[4974]: E1013 18:50:22.957976 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b\": container with ID starting with 5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b not found: ID does not exist" containerID="5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.958028 4974 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b"} err="failed to get container status \"5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b\": rpc error: code = NotFound desc = could not find container \"5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b\": container with ID starting with 5fe8950feeb7a9e9b20a715deb6e49f946fce8efca0848f54c7809df68f06a4b not found: ID does not exist" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.958061 4974 scope.go:117] "RemoveContainer" containerID="f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5" Oct 13 18:50:22 crc kubenswrapper[4974]: E1013 18:50:22.958474 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5\": container with ID starting with f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5 not found: ID does not exist" containerID="f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.958502 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5"} err="failed to get container status \"f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5\": rpc error: code = NotFound desc = could not find container \"f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5\": container with ID starting with f24aff877c945d02ff00ad61ce2e7868b518a28857fff4031b37d2ba330f13b5 not found: ID does not exist" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.958527 4974 scope.go:117] "RemoveContainer" containerID="feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00" Oct 13 18:50:22 crc kubenswrapper[4974]: E1013 
18:50:22.958873 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00\": container with ID starting with feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00 not found: ID does not exist" containerID="feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00" Oct 13 18:50:22 crc kubenswrapper[4974]: I1013 18:50:22.959008 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00"} err="failed to get container status \"feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00\": rpc error: code = NotFound desc = could not find container \"feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00\": container with ID starting with feeb3abf9f1fb178d86a25a3bd518c0df6b5f6865724cee324e5f42b66541f00 not found: ID does not exist" Oct 13 18:50:23 crc kubenswrapper[4974]: I1013 18:50:23.822491 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" path="/var/lib/kubelet/pods/5c5dfa6b-69fc-4283-9f3a-f16d34088eae/volumes" Oct 13 18:50:37 crc kubenswrapper[4974]: I1013 18:50:37.743157 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:50:37 crc kubenswrapper[4974]: I1013 18:50:37.744054 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Oct 13 18:51:07 crc kubenswrapper[4974]: I1013 18:51:07.743629 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:51:07 crc kubenswrapper[4974]: I1013 18:51:07.744371 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:51:07 crc kubenswrapper[4974]: I1013 18:51:07.744444 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:51:07 crc kubenswrapper[4974]: I1013 18:51:07.745532 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7a17bc64a82f6cca5f6f48f02d11fd723d2db77ce76e4c7c6f89c54ff5a7525c"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 18:51:07 crc kubenswrapper[4974]: I1013 18:51:07.745629 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://7a17bc64a82f6cca5f6f48f02d11fd723d2db77ce76e4c7c6f89c54ff5a7525c" gracePeriod=600 Oct 13 18:51:08 crc kubenswrapper[4974]: I1013 18:51:08.369690 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" 
containerID="7a17bc64a82f6cca5f6f48f02d11fd723d2db77ce76e4c7c6f89c54ff5a7525c" exitCode=0 Oct 13 18:51:08 crc kubenswrapper[4974]: I1013 18:51:08.369736 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"7a17bc64a82f6cca5f6f48f02d11fd723d2db77ce76e4c7c6f89c54ff5a7525c"} Oct 13 18:51:08 crc kubenswrapper[4974]: I1013 18:51:08.370051 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1"} Oct 13 18:51:08 crc kubenswrapper[4974]: I1013 18:51:08.370077 4974 scope.go:117] "RemoveContainer" containerID="397339e75af1c2f1453d36f0f527b196810be5f601d4e237860b0e26438f11a2" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.581495 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-s4qj6"] Oct 13 18:51:12 crc kubenswrapper[4974]: E1013 18:51:12.599399 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerName="registry-server" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.599434 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerName="registry-server" Oct 13 18:51:12 crc kubenswrapper[4974]: E1013 18:51:12.599451 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerName="extract-content" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.599461 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerName="extract-content" Oct 13 18:51:12 crc kubenswrapper[4974]: E1013 18:51:12.599485 4974 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerName="extract-utilities" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.599495 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerName="extract-utilities" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.601062 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c5dfa6b-69fc-4283-9f3a-f16d34088eae" containerName="registry-server" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.604617 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s4qj6"] Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.618001 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.784572 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-utilities\") pod \"community-operators-s4qj6\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.784701 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-catalog-content\") pod \"community-operators-s4qj6\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.785014 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq5jj\" (UniqueName: \"kubernetes.io/projected/f4a3067b-ff40-42b9-8aa2-948a1c24998d-kube-api-access-tq5jj\") pod 
\"community-operators-s4qj6\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.887059 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-utilities\") pod \"community-operators-s4qj6\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.887164 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-catalog-content\") pod \"community-operators-s4qj6\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.887237 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq5jj\" (UniqueName: \"kubernetes.io/projected/f4a3067b-ff40-42b9-8aa2-948a1c24998d-kube-api-access-tq5jj\") pod \"community-operators-s4qj6\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.888338 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-catalog-content\") pod \"community-operators-s4qj6\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.888382 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-utilities\") pod \"community-operators-s4qj6\" (UID: 
\"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.910958 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq5jj\" (UniqueName: \"kubernetes.io/projected/f4a3067b-ff40-42b9-8aa2-948a1c24998d-kube-api-access-tq5jj\") pod \"community-operators-s4qj6\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:12 crc kubenswrapper[4974]: I1013 18:51:12.945119 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:13 crc kubenswrapper[4974]: I1013 18:51:13.464079 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-s4qj6"] Oct 13 18:51:14 crc kubenswrapper[4974]: I1013 18:51:14.472407 4974 generic.go:334] "Generic (PLEG): container finished" podID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerID="2ce22aef24b1871e05911dfae7be0f417181e7a6fa6d70d2fc8015fa584ee474" exitCode=0 Oct 13 18:51:14 crc kubenswrapper[4974]: I1013 18:51:14.472478 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4qj6" event={"ID":"f4a3067b-ff40-42b9-8aa2-948a1c24998d","Type":"ContainerDied","Data":"2ce22aef24b1871e05911dfae7be0f417181e7a6fa6d70d2fc8015fa584ee474"} Oct 13 18:51:14 crc kubenswrapper[4974]: I1013 18:51:14.472560 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4qj6" event={"ID":"f4a3067b-ff40-42b9-8aa2-948a1c24998d","Type":"ContainerStarted","Data":"69b6efb571d9015f77cb5bd2f5e8ad64bdc5db054c390d7c9231633c958e1ac0"} Oct 13 18:51:16 crc kubenswrapper[4974]: I1013 18:51:16.501058 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4qj6" 
event={"ID":"f4a3067b-ff40-42b9-8aa2-948a1c24998d","Type":"ContainerStarted","Data":"acf5b716dc05c29bd85f156a01ae5c9aaa999185e3c500617b1ee8e0665e1f0e"} Oct 13 18:51:17 crc kubenswrapper[4974]: I1013 18:51:17.515918 4974 generic.go:334] "Generic (PLEG): container finished" podID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerID="acf5b716dc05c29bd85f156a01ae5c9aaa999185e3c500617b1ee8e0665e1f0e" exitCode=0 Oct 13 18:51:17 crc kubenswrapper[4974]: I1013 18:51:17.516010 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4qj6" event={"ID":"f4a3067b-ff40-42b9-8aa2-948a1c24998d","Type":"ContainerDied","Data":"acf5b716dc05c29bd85f156a01ae5c9aaa999185e3c500617b1ee8e0665e1f0e"} Oct 13 18:51:18 crc kubenswrapper[4974]: I1013 18:51:18.533264 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4qj6" event={"ID":"f4a3067b-ff40-42b9-8aa2-948a1c24998d","Type":"ContainerStarted","Data":"e73c67e360d65ca02f5229664f790549a805212681313514264c3e151cd76b97"} Oct 13 18:51:22 crc kubenswrapper[4974]: I1013 18:51:22.946692 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:22 crc kubenswrapper[4974]: I1013 18:51:22.947702 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:23 crc kubenswrapper[4974]: I1013 18:51:23.027859 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:23 crc kubenswrapper[4974]: I1013 18:51:23.063161 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-s4qj6" podStartSLOduration=7.504271756 podStartE2EDuration="11.0631319s" podCreationTimestamp="2025-10-13 18:51:12 +0000 UTC" firstStartedPulling="2025-10-13 18:51:14.474934285 +0000 UTC 
m=+2209.379300405" lastFinishedPulling="2025-10-13 18:51:18.033794459 +0000 UTC m=+2212.938160549" observedRunningTime="2025-10-13 18:51:18.559068495 +0000 UTC m=+2213.463434635" watchObservedRunningTime="2025-10-13 18:51:23.0631319 +0000 UTC m=+2217.967497990" Oct 13 18:51:23 crc kubenswrapper[4974]: I1013 18:51:23.652029 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:25 crc kubenswrapper[4974]: I1013 18:51:25.992303 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s4qj6"] Oct 13 18:51:25 crc kubenswrapper[4974]: I1013 18:51:25.993123 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-s4qj6" podUID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerName="registry-server" containerID="cri-o://e73c67e360d65ca02f5229664f790549a805212681313514264c3e151cd76b97" gracePeriod=2 Oct 13 18:51:26 crc kubenswrapper[4974]: I1013 18:51:26.648386 4974 generic.go:334] "Generic (PLEG): container finished" podID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerID="e73c67e360d65ca02f5229664f790549a805212681313514264c3e151cd76b97" exitCode=0 Oct 13 18:51:26 crc kubenswrapper[4974]: I1013 18:51:26.648491 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4qj6" event={"ID":"f4a3067b-ff40-42b9-8aa2-948a1c24998d","Type":"ContainerDied","Data":"e73c67e360d65ca02f5229664f790549a805212681313514264c3e151cd76b97"} Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.058425 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.124436 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tq5jj\" (UniqueName: \"kubernetes.io/projected/f4a3067b-ff40-42b9-8aa2-948a1c24998d-kube-api-access-tq5jj\") pod \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.124533 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-utilities\") pod \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.124756 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-catalog-content\") pod \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\" (UID: \"f4a3067b-ff40-42b9-8aa2-948a1c24998d\") " Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.126321 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-utilities" (OuterVolumeSpecName: "utilities") pod "f4a3067b-ff40-42b9-8aa2-948a1c24998d" (UID: "f4a3067b-ff40-42b9-8aa2-948a1c24998d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.131272 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4a3067b-ff40-42b9-8aa2-948a1c24998d-kube-api-access-tq5jj" (OuterVolumeSpecName: "kube-api-access-tq5jj") pod "f4a3067b-ff40-42b9-8aa2-948a1c24998d" (UID: "f4a3067b-ff40-42b9-8aa2-948a1c24998d"). InnerVolumeSpecName "kube-api-access-tq5jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.224431 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4a3067b-ff40-42b9-8aa2-948a1c24998d" (UID: "f4a3067b-ff40-42b9-8aa2-948a1c24998d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.228942 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tq5jj\" (UniqueName: \"kubernetes.io/projected/f4a3067b-ff40-42b9-8aa2-948a1c24998d-kube-api-access-tq5jj\") on node \"crc\" DevicePath \"\"" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.228976 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.229086 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4a3067b-ff40-42b9-8aa2-948a1c24998d-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.670437 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-s4qj6" event={"ID":"f4a3067b-ff40-42b9-8aa2-948a1c24998d","Type":"ContainerDied","Data":"69b6efb571d9015f77cb5bd2f5e8ad64bdc5db054c390d7c9231633c958e1ac0"} Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.670488 4974 scope.go:117] "RemoveContainer" containerID="e73c67e360d65ca02f5229664f790549a805212681313514264c3e151cd76b97" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.670500 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-s4qj6" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.719294 4974 scope.go:117] "RemoveContainer" containerID="acf5b716dc05c29bd85f156a01ae5c9aaa999185e3c500617b1ee8e0665e1f0e" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.721394 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-s4qj6"] Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.735425 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-s4qj6"] Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.757861 4974 scope.go:117] "RemoveContainer" containerID="2ce22aef24b1871e05911dfae7be0f417181e7a6fa6d70d2fc8015fa584ee474" Oct 13 18:51:27 crc kubenswrapper[4974]: I1013 18:51:27.831384 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" path="/var/lib/kubelet/pods/f4a3067b-ff40-42b9-8aa2-948a1c24998d/volumes" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.073094 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fv6ps"] Oct 13 18:52:29 crc kubenswrapper[4974]: E1013 18:52:29.074482 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerName="registry-server" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.074501 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerName="registry-server" Oct 13 18:52:29 crc kubenswrapper[4974]: E1013 18:52:29.074531 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerName="extract-utilities" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.074538 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerName="extract-utilities" 
Oct 13 18:52:29 crc kubenswrapper[4974]: E1013 18:52:29.074563 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerName="extract-content" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.074572 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerName="extract-content" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.074845 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4a3067b-ff40-42b9-8aa2-948a1c24998d" containerName="registry-server" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.076603 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.092761 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv6ps"] Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.190521 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxnrn\" (UniqueName: \"kubernetes.io/projected/db3982be-c3a5-444f-a63f-16b29250e4cc-kube-api-access-jxnrn\") pod \"redhat-marketplace-fv6ps\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.190613 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-utilities\") pod \"redhat-marketplace-fv6ps\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.190678 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-catalog-content\") pod \"redhat-marketplace-fv6ps\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.292360 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxnrn\" (UniqueName: \"kubernetes.io/projected/db3982be-c3a5-444f-a63f-16b29250e4cc-kube-api-access-jxnrn\") pod \"redhat-marketplace-fv6ps\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.292491 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-utilities\") pod \"redhat-marketplace-fv6ps\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.292553 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-catalog-content\") pod \"redhat-marketplace-fv6ps\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.293032 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-utilities\") pod \"redhat-marketplace-fv6ps\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.293152 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-catalog-content\") pod \"redhat-marketplace-fv6ps\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.317454 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxnrn\" (UniqueName: \"kubernetes.io/projected/db3982be-c3a5-444f-a63f-16b29250e4cc-kube-api-access-jxnrn\") pod \"redhat-marketplace-fv6ps\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.403555 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:29 crc kubenswrapper[4974]: I1013 18:52:29.733500 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv6ps"] Oct 13 18:52:29 crc kubenswrapper[4974]: W1013 18:52:29.734466 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb3982be_c3a5_444f_a63f_16b29250e4cc.slice/crio-4ea4ba57a004897a9e529a3e91ebbea974b484e10d62955bc83ebfb9c912bc44 WatchSource:0}: Error finding container 4ea4ba57a004897a9e529a3e91ebbea974b484e10d62955bc83ebfb9c912bc44: Status 404 returned error can't find the container with id 4ea4ba57a004897a9e529a3e91ebbea974b484e10d62955bc83ebfb9c912bc44 Oct 13 18:52:30 crc kubenswrapper[4974]: I1013 18:52:30.415786 4974 generic.go:334] "Generic (PLEG): container finished" podID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerID="d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e" exitCode=0 Oct 13 18:52:30 crc kubenswrapper[4974]: I1013 18:52:30.415859 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv6ps" 
event={"ID":"db3982be-c3a5-444f-a63f-16b29250e4cc","Type":"ContainerDied","Data":"d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e"} Oct 13 18:52:30 crc kubenswrapper[4974]: I1013 18:52:30.416149 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv6ps" event={"ID":"db3982be-c3a5-444f-a63f-16b29250e4cc","Type":"ContainerStarted","Data":"4ea4ba57a004897a9e529a3e91ebbea974b484e10d62955bc83ebfb9c912bc44"} Oct 13 18:52:31 crc kubenswrapper[4974]: I1013 18:52:31.427986 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv6ps" event={"ID":"db3982be-c3a5-444f-a63f-16b29250e4cc","Type":"ContainerStarted","Data":"3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea"} Oct 13 18:52:32 crc kubenswrapper[4974]: I1013 18:52:32.440664 4974 generic.go:334] "Generic (PLEG): container finished" podID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerID="3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea" exitCode=0 Oct 13 18:52:32 crc kubenswrapper[4974]: I1013 18:52:32.440725 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv6ps" event={"ID":"db3982be-c3a5-444f-a63f-16b29250e4cc","Type":"ContainerDied","Data":"3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea"} Oct 13 18:52:33 crc kubenswrapper[4974]: I1013 18:52:33.457118 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv6ps" event={"ID":"db3982be-c3a5-444f-a63f-16b29250e4cc","Type":"ContainerStarted","Data":"db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19"} Oct 13 18:52:33 crc kubenswrapper[4974]: I1013 18:52:33.482035 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fv6ps" podStartSLOduration=2.063585977 podStartE2EDuration="4.482007022s" podCreationTimestamp="2025-10-13 18:52:29 +0000 
UTC" firstStartedPulling="2025-10-13 18:52:30.420757046 +0000 UTC m=+2285.325123166" lastFinishedPulling="2025-10-13 18:52:32.839178121 +0000 UTC m=+2287.743544211" observedRunningTime="2025-10-13 18:52:33.479487181 +0000 UTC m=+2288.383853261" watchObservedRunningTime="2025-10-13 18:52:33.482007022 +0000 UTC m=+2288.386373102" Oct 13 18:52:39 crc kubenswrapper[4974]: I1013 18:52:39.403994 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:39 crc kubenswrapper[4974]: I1013 18:52:39.404777 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:39 crc kubenswrapper[4974]: I1013 18:52:39.498972 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:39 crc kubenswrapper[4974]: I1013 18:52:39.617703 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:39 crc kubenswrapper[4974]: I1013 18:52:39.743560 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv6ps"] Oct 13 18:52:41 crc kubenswrapper[4974]: I1013 18:52:41.617019 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fv6ps" podUID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerName="registry-server" containerID="cri-o://db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19" gracePeriod=2 Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.202904 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.323397 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-catalog-content\") pod \"db3982be-c3a5-444f-a63f-16b29250e4cc\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.323468 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxnrn\" (UniqueName: \"kubernetes.io/projected/db3982be-c3a5-444f-a63f-16b29250e4cc-kube-api-access-jxnrn\") pod \"db3982be-c3a5-444f-a63f-16b29250e4cc\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.323541 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-utilities\") pod \"db3982be-c3a5-444f-a63f-16b29250e4cc\" (UID: \"db3982be-c3a5-444f-a63f-16b29250e4cc\") " Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.324784 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-utilities" (OuterVolumeSpecName: "utilities") pod "db3982be-c3a5-444f-a63f-16b29250e4cc" (UID: "db3982be-c3a5-444f-a63f-16b29250e4cc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.334144 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db3982be-c3a5-444f-a63f-16b29250e4cc-kube-api-access-jxnrn" (OuterVolumeSpecName: "kube-api-access-jxnrn") pod "db3982be-c3a5-444f-a63f-16b29250e4cc" (UID: "db3982be-c3a5-444f-a63f-16b29250e4cc"). InnerVolumeSpecName "kube-api-access-jxnrn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.336732 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db3982be-c3a5-444f-a63f-16b29250e4cc" (UID: "db3982be-c3a5-444f-a63f-16b29250e4cc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.425994 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.426027 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxnrn\" (UniqueName: \"kubernetes.io/projected/db3982be-c3a5-444f-a63f-16b29250e4cc-kube-api-access-jxnrn\") on node \"crc\" DevicePath \"\"" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.426039 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db3982be-c3a5-444f-a63f-16b29250e4cc-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.631166 4974 generic.go:334] "Generic (PLEG): container finished" podID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerID="db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19" exitCode=0 Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.631228 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv6ps" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.631246 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv6ps" event={"ID":"db3982be-c3a5-444f-a63f-16b29250e4cc","Type":"ContainerDied","Data":"db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19"} Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.631309 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv6ps" event={"ID":"db3982be-c3a5-444f-a63f-16b29250e4cc","Type":"ContainerDied","Data":"4ea4ba57a004897a9e529a3e91ebbea974b484e10d62955bc83ebfb9c912bc44"} Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.631351 4974 scope.go:117] "RemoveContainer" containerID="db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.667894 4974 scope.go:117] "RemoveContainer" containerID="3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.674948 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv6ps"] Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.684724 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv6ps"] Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.705815 4974 scope.go:117] "RemoveContainer" containerID="d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.770650 4974 scope.go:117] "RemoveContainer" containerID="db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19" Oct 13 18:52:42 crc kubenswrapper[4974]: E1013 18:52:42.771179 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19\": container with ID starting with db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19 not found: ID does not exist" containerID="db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.771210 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19"} err="failed to get container status \"db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19\": rpc error: code = NotFound desc = could not find container \"db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19\": container with ID starting with db3debb975e45878186c6f48ee5f97f731c56bd74adbea1df3f04ef517edae19 not found: ID does not exist" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.771231 4974 scope.go:117] "RemoveContainer" containerID="3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea" Oct 13 18:52:42 crc kubenswrapper[4974]: E1013 18:52:42.771835 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea\": container with ID starting with 3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea not found: ID does not exist" containerID="3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.771895 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea"} err="failed to get container status \"3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea\": rpc error: code = NotFound desc = could not find container \"3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea\": container with ID 
starting with 3ed4789d1ff517124ebf529fd4be59b781eb4dbfb27f2ee7b51d7790683ca3ea not found: ID does not exist" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.771938 4974 scope.go:117] "RemoveContainer" containerID="d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e" Oct 13 18:52:42 crc kubenswrapper[4974]: E1013 18:52:42.772459 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e\": container with ID starting with d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e not found: ID does not exist" containerID="d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e" Oct 13 18:52:42 crc kubenswrapper[4974]: I1013 18:52:42.772484 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e"} err="failed to get container status \"d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e\": rpc error: code = NotFound desc = could not find container \"d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e\": container with ID starting with d633ba3555fceba8c3b5aaf3e17870d064309a6039689537e19e98ee8e261b4e not found: ID does not exist" Oct 13 18:52:43 crc kubenswrapper[4974]: I1013 18:52:43.828447 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db3982be-c3a5-444f-a63f-16b29250e4cc" path="/var/lib/kubelet/pods/db3982be-c3a5-444f-a63f-16b29250e4cc/volumes" Oct 13 18:53:37 crc kubenswrapper[4974]: I1013 18:53:37.743379 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:53:37 crc kubenswrapper[4974]: I1013 
18:53:37.744046 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:54:07 crc kubenswrapper[4974]: I1013 18:54:07.743472 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:54:07 crc kubenswrapper[4974]: I1013 18:54:07.744199 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:54:37 crc kubenswrapper[4974]: I1013 18:54:37.742803 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 18:54:37 crc kubenswrapper[4974]: I1013 18:54:37.743494 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 18:54:37 crc kubenswrapper[4974]: I1013 18:54:37.743565 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 18:54:37 crc kubenswrapper[4974]: I1013 18:54:37.744825 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 18:54:37 crc kubenswrapper[4974]: I1013 18:54:37.744945 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" gracePeriod=600 Oct 13 18:54:37 crc kubenswrapper[4974]: E1013 18:54:37.875183 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:54:38 crc kubenswrapper[4974]: I1013 18:54:38.025953 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" exitCode=0 Oct 13 18:54:38 crc kubenswrapper[4974]: I1013 18:54:38.026040 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1"} Oct 13 18:54:38 crc 
kubenswrapper[4974]: I1013 18:54:38.026271 4974 scope.go:117] "RemoveContainer" containerID="7a17bc64a82f6cca5f6f48f02d11fd723d2db77ce76e4c7c6f89c54ff5a7525c" Oct 13 18:54:38 crc kubenswrapper[4974]: I1013 18:54:38.027215 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:54:38 crc kubenswrapper[4974]: E1013 18:54:38.027767 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:54:49 crc kubenswrapper[4974]: I1013 18:54:49.812001 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:54:49 crc kubenswrapper[4974]: E1013 18:54:49.812677 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:54:59 crc kubenswrapper[4974]: I1013 18:54:59.268390 4974 generic.go:334] "Generic (PLEG): container finished" podID="dc0e7077-837e-4e51-a095-60eed2b94a51" containerID="e468d45b9e6d20b72aa6a98fc8ba2a2798b99b8edc1887c285c629253537501e" exitCode=0 Oct 13 18:54:59 crc kubenswrapper[4974]: I1013 18:54:59.268459 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" 
event={"ID":"dc0e7077-837e-4e51-a095-60eed2b94a51","Type":"ContainerDied","Data":"e468d45b9e6d20b72aa6a98fc8ba2a2798b99b8edc1887c285c629253537501e"} Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.730184 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.830127 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-secret-0\") pod \"dc0e7077-837e-4e51-a095-60eed2b94a51\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.830242 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-ssh-key\") pod \"dc0e7077-837e-4e51-a095-60eed2b94a51\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.830522 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-inventory\") pod \"dc0e7077-837e-4e51-a095-60eed2b94a51\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.830609 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2xdw\" (UniqueName: \"kubernetes.io/projected/dc0e7077-837e-4e51-a095-60eed2b94a51-kube-api-access-s2xdw\") pod \"dc0e7077-837e-4e51-a095-60eed2b94a51\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.830703 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-combined-ca-bundle\") pod \"dc0e7077-837e-4e51-a095-60eed2b94a51\" (UID: \"dc0e7077-837e-4e51-a095-60eed2b94a51\") " Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.836796 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0e7077-837e-4e51-a095-60eed2b94a51-kube-api-access-s2xdw" (OuterVolumeSpecName: "kube-api-access-s2xdw") pod "dc0e7077-837e-4e51-a095-60eed2b94a51" (UID: "dc0e7077-837e-4e51-a095-60eed2b94a51"). InnerVolumeSpecName "kube-api-access-s2xdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.842627 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "dc0e7077-837e-4e51-a095-60eed2b94a51" (UID: "dc0e7077-837e-4e51-a095-60eed2b94a51"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.859050 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "dc0e7077-837e-4e51-a095-60eed2b94a51" (UID: "dc0e7077-837e-4e51-a095-60eed2b94a51"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.865570 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-inventory" (OuterVolumeSpecName: "inventory") pod "dc0e7077-837e-4e51-a095-60eed2b94a51" (UID: "dc0e7077-837e-4e51-a095-60eed2b94a51"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.869696 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dc0e7077-837e-4e51-a095-60eed2b94a51" (UID: "dc0e7077-837e-4e51-a095-60eed2b94a51"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.933999 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.934030 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2xdw\" (UniqueName: \"kubernetes.io/projected/dc0e7077-837e-4e51-a095-60eed2b94a51-kube-api-access-s2xdw\") on node \"crc\" DevicePath \"\"" Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.934045 4974 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.934076 4974 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:55:00 crc kubenswrapper[4974]: I1013 18:55:00.934097 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dc0e7077-837e-4e51-a095-60eed2b94a51-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.293837 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" 
event={"ID":"dc0e7077-837e-4e51-a095-60eed2b94a51","Type":"ContainerDied","Data":"09c538efe6a6cce0a400878f29c718bbcb7380f6937dad388ff3a882f3c26738"} Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.294195 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09c538efe6a6cce0a400878f29c718bbcb7380f6937dad388ff3a882f3c26738" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.293912 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-bpglt" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.407913 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759"] Oct 13 18:55:01 crc kubenswrapper[4974]: E1013 18:55:01.408282 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerName="registry-server" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.408301 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerName="registry-server" Oct 13 18:55:01 crc kubenswrapper[4974]: E1013 18:55:01.408320 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerName="extract-utilities" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.408329 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerName="extract-utilities" Oct 13 18:55:01 crc kubenswrapper[4974]: E1013 18:55:01.408348 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerName="extract-content" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.408356 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerName="extract-content" Oct 13 18:55:01 crc kubenswrapper[4974]: E1013 
18:55:01.408382 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0e7077-837e-4e51-a095-60eed2b94a51" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.408389 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0e7077-837e-4e51-a095-60eed2b94a51" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.408840 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="db3982be-c3a5-444f-a63f-16b29250e4cc" containerName="registry-server" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.408856 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0e7077-837e-4e51-a095-60eed2b94a51" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.409644 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.415484 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.415724 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.415881 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.416500 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.417187 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.417278 4974 reflector.go:368] 
Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.417432 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.422706 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759"] Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.545587 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvtvg\" (UniqueName: \"kubernetes.io/projected/533bff4f-cd80-4893-95e8-404276a2e0d0-kube-api-access-hvtvg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.545721 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.545752 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.545781 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.545802 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.545917 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.545969 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.546027 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: 
\"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.546059 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.647785 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.647880 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.648469 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.648512 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.648567 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvtvg\" (UniqueName: \"kubernetes.io/projected/533bff4f-cd80-4893-95e8-404276a2e0d0-kube-api-access-hvtvg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.649018 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.649050 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.649082 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.649368 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.650219 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.652251 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.652362 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.652411 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.653721 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.654357 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.654719 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.656322 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc 
kubenswrapper[4974]: I1013 18:55:01.670700 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvtvg\" (UniqueName: \"kubernetes.io/projected/533bff4f-cd80-4893-95e8-404276a2e0d0-kube-api-access-hvtvg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-mj759\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:01 crc kubenswrapper[4974]: I1013 18:55:01.736348 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:55:02 crc kubenswrapper[4974]: I1013 18:55:02.403285 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759"] Oct 13 18:55:02 crc kubenswrapper[4974]: I1013 18:55:02.412974 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 18:55:03 crc kubenswrapper[4974]: I1013 18:55:03.314249 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" event={"ID":"533bff4f-cd80-4893-95e8-404276a2e0d0","Type":"ContainerStarted","Data":"57f9f445605a1e1838ca74e5935c78ae9141e809574ba3f4b914b069f3be0134"} Oct 13 18:55:03 crc kubenswrapper[4974]: I1013 18:55:03.315072 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" event={"ID":"533bff4f-cd80-4893-95e8-404276a2e0d0","Type":"ContainerStarted","Data":"3f4df6cd67548e11dee39dd834d4cfbdbff383470896982815d263083c780504"} Oct 13 18:55:03 crc kubenswrapper[4974]: I1013 18:55:03.330618 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" podStartSLOduration=1.769264732 podStartE2EDuration="2.330599747s" podCreationTimestamp="2025-10-13 18:55:01 +0000 UTC" firstStartedPulling="2025-10-13 18:55:02.41272525 +0000 
UTC m=+2437.317091350" lastFinishedPulling="2025-10-13 18:55:02.974060245 +0000 UTC m=+2437.878426365" observedRunningTime="2025-10-13 18:55:03.327449508 +0000 UTC m=+2438.231815618" watchObservedRunningTime="2025-10-13 18:55:03.330599747 +0000 UTC m=+2438.234965837" Oct 13 18:55:04 crc kubenswrapper[4974]: I1013 18:55:04.812567 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:55:04 crc kubenswrapper[4974]: E1013 18:55:04.813382 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:55:16 crc kubenswrapper[4974]: I1013 18:55:16.812154 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:55:16 crc kubenswrapper[4974]: E1013 18:55:16.813157 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:55:30 crc kubenswrapper[4974]: I1013 18:55:30.812064 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:55:30 crc kubenswrapper[4974]: E1013 18:55:30.812889 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:55:45 crc kubenswrapper[4974]: I1013 18:55:45.830045 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:55:45 crc kubenswrapper[4974]: E1013 18:55:45.831291 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:55:56 crc kubenswrapper[4974]: I1013 18:55:56.812627 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:55:56 crc kubenswrapper[4974]: E1013 18:55:56.813414 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:56:11 crc kubenswrapper[4974]: I1013 18:56:11.817258 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:56:11 crc kubenswrapper[4974]: E1013 18:56:11.818081 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:56:22 crc kubenswrapper[4974]: I1013 18:56:22.811848 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:56:22 crc kubenswrapper[4974]: E1013 18:56:22.812860 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:56:36 crc kubenswrapper[4974]: I1013 18:56:36.811858 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:56:36 crc kubenswrapper[4974]: E1013 18:56:36.812809 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:56:51 crc kubenswrapper[4974]: I1013 18:56:51.811523 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:56:51 crc kubenswrapper[4974]: E1013 18:56:51.812459 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:57:06 crc kubenswrapper[4974]: I1013 18:57:06.811259 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:57:06 crc kubenswrapper[4974]: E1013 18:57:06.812248 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:57:19 crc kubenswrapper[4974]: I1013 18:57:19.813416 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:57:19 crc kubenswrapper[4974]: E1013 18:57:19.815751 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:57:31 crc kubenswrapper[4974]: I1013 18:57:31.814060 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:57:31 crc kubenswrapper[4974]: E1013 18:57:31.815161 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:57:42 crc kubenswrapper[4974]: I1013 18:57:42.812974 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:57:42 crc kubenswrapper[4974]: E1013 18:57:42.815596 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:57:54 crc kubenswrapper[4974]: I1013 18:57:54.812098 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:57:54 crc kubenswrapper[4974]: E1013 18:57:54.813078 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:58:07 crc kubenswrapper[4974]: I1013 18:58:07.813120 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:58:07 crc kubenswrapper[4974]: E1013 18:58:07.813969 4974 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:58:18 crc kubenswrapper[4974]: I1013 18:58:18.811807 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:58:18 crc kubenswrapper[4974]: E1013 18:58:18.812628 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:58:33 crc kubenswrapper[4974]: I1013 18:58:33.812701 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:58:33 crc kubenswrapper[4974]: E1013 18:58:33.814153 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:58:46 crc kubenswrapper[4974]: I1013 18:58:46.812217 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:58:46 crc kubenswrapper[4974]: E1013 18:58:46.813032 4974 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:58:57 crc kubenswrapper[4974]: I1013 18:58:57.812149 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 18:58:57 crc kubenswrapper[4974]: E1013 18:58:57.813154 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 18:59:01 crc kubenswrapper[4974]: I1013 18:59:01.093092 4974 generic.go:334] "Generic (PLEG): container finished" podID="533bff4f-cd80-4893-95e8-404276a2e0d0" containerID="57f9f445605a1e1838ca74e5935c78ae9141e809574ba3f4b914b069f3be0134" exitCode=0 Oct 13 18:59:01 crc kubenswrapper[4974]: I1013 18:59:01.093221 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" event={"ID":"533bff4f-cd80-4893-95e8-404276a2e0d0","Type":"ContainerDied","Data":"57f9f445605a1e1838ca74e5935c78ae9141e809574ba3f4b914b069f3be0134"} Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.516725 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.655401 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-0\") pod \"533bff4f-cd80-4893-95e8-404276a2e0d0\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.655844 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-1\") pod \"533bff4f-cd80-4893-95e8-404276a2e0d0\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.655963 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-extra-config-0\") pod \"533bff4f-cd80-4893-95e8-404276a2e0d0\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.656040 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-0\") pod \"533bff4f-cd80-4893-95e8-404276a2e0d0\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.656124 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-ssh-key\") pod \"533bff4f-cd80-4893-95e8-404276a2e0d0\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.656169 4974 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-1\") pod \"533bff4f-cd80-4893-95e8-404276a2e0d0\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.656211 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-inventory\") pod \"533bff4f-cd80-4893-95e8-404276a2e0d0\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.656231 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-combined-ca-bundle\") pod \"533bff4f-cd80-4893-95e8-404276a2e0d0\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.656267 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvtvg\" (UniqueName: \"kubernetes.io/projected/533bff4f-cd80-4893-95e8-404276a2e0d0-kube-api-access-hvtvg\") pod \"533bff4f-cd80-4893-95e8-404276a2e0d0\" (UID: \"533bff4f-cd80-4893-95e8-404276a2e0d0\") " Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.678496 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "533bff4f-cd80-4893-95e8-404276a2e0d0" (UID: "533bff4f-cd80-4893-95e8-404276a2e0d0"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.678524 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/533bff4f-cd80-4893-95e8-404276a2e0d0-kube-api-access-hvtvg" (OuterVolumeSpecName: "kube-api-access-hvtvg") pod "533bff4f-cd80-4893-95e8-404276a2e0d0" (UID: "533bff4f-cd80-4893-95e8-404276a2e0d0"). InnerVolumeSpecName "kube-api-access-hvtvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.683858 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "533bff4f-cd80-4893-95e8-404276a2e0d0" (UID: "533bff4f-cd80-4893-95e8-404276a2e0d0"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.707062 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "533bff4f-cd80-4893-95e8-404276a2e0d0" (UID: "533bff4f-cd80-4893-95e8-404276a2e0d0"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.708346 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "533bff4f-cd80-4893-95e8-404276a2e0d0" (UID: "533bff4f-cd80-4893-95e8-404276a2e0d0"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.708582 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "533bff4f-cd80-4893-95e8-404276a2e0d0" (UID: "533bff4f-cd80-4893-95e8-404276a2e0d0"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.708882 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "533bff4f-cd80-4893-95e8-404276a2e0d0" (UID: "533bff4f-cd80-4893-95e8-404276a2e0d0"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.709053 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "533bff4f-cd80-4893-95e8-404276a2e0d0" (UID: "533bff4f-cd80-4893-95e8-404276a2e0d0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.709944 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-inventory" (OuterVolumeSpecName: "inventory") pod "533bff4f-cd80-4893-95e8-404276a2e0d0" (UID: "533bff4f-cd80-4893-95e8-404276a2e0d0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.758172 4974 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.758207 4974 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.758217 4974 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.758225 4974 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.758237 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.758245 4974 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.758255 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 
18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.758266 4974 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/533bff4f-cd80-4893-95e8-404276a2e0d0-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 18:59:02 crc kubenswrapper[4974]: I1013 18:59:02.758274 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvtvg\" (UniqueName: \"kubernetes.io/projected/533bff4f-cd80-4893-95e8-404276a2e0d0-kube-api-access-hvtvg\") on node \"crc\" DevicePath \"\"" Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.119946 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759" event={"ID":"533bff4f-cd80-4893-95e8-404276a2e0d0","Type":"ContainerDied","Data":"3f4df6cd67548e11dee39dd834d4cfbdbff383470896982815d263083c780504"} Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.120030 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f4df6cd67548e11dee39dd834d4cfbdbff383470896982815d263083c780504" Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.120044 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-mj759"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.242869 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"]
Oct 13 18:59:03 crc kubenswrapper[4974]: E1013 18:59:03.243481 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="533bff4f-cd80-4893-95e8-404276a2e0d0" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.243508 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="533bff4f-cd80-4893-95e8-404276a2e0d0" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.243965 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="533bff4f-cd80-4893-95e8-404276a2e0d0" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.245122 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.280783 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rd672"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.280792 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.280865 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.281020 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.281167 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.282727 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.282760 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.282846 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.282895 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.282927 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.282953 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.283032 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdcmm\" (UniqueName: \"kubernetes.io/projected/a947ab95-2720-4cda-a618-470943b7443c-kube-api-access-bdcmm\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.329814 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"]
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.384837 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.384887 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.384965 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.385007 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.385039 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.385065 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.385125 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdcmm\" (UniqueName: \"kubernetes.io/projected/a947ab95-2720-4cda-a618-470943b7443c-kube-api-access-bdcmm\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.388534 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.389142 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.389290 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.389422 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.391106 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.391803 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.400449 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdcmm\" (UniqueName: \"kubernetes.io/projected/a947ab95-2720-4cda-a618-470943b7443c-kube-api-access-bdcmm\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:03 crc kubenswrapper[4974]: I1013 18:59:03.610885 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"
Oct 13 18:59:04 crc kubenswrapper[4974]: I1013 18:59:04.218983 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr"]
Oct 13 18:59:05 crc kubenswrapper[4974]: I1013 18:59:05.142249 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr" event={"ID":"a947ab95-2720-4cda-a618-470943b7443c","Type":"ContainerStarted","Data":"a5c1ef7ccc98b39626619aac357fa5ff8fee5bf58cc461d6337a7d502ced2094"}
Oct 13 18:59:05 crc kubenswrapper[4974]: I1013 18:59:05.142619 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr" event={"ID":"a947ab95-2720-4cda-a618-470943b7443c","Type":"ContainerStarted","Data":"9b80ef27c737abc2b27c77b84408c0e984e0645f33e683ded28fb7fddae415fc"}
Oct 13 18:59:05 crc kubenswrapper[4974]: I1013 18:59:05.172151 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr" podStartSLOduration=1.601029984 podStartE2EDuration="2.172128373s" podCreationTimestamp="2025-10-13 18:59:03 +0000 UTC" firstStartedPulling="2025-10-13 18:59:04.230911621 +0000 UTC m=+2679.135277711" lastFinishedPulling="2025-10-13 18:59:04.80201001 +0000 UTC m=+2679.706376100" observedRunningTime="2025-10-13 18:59:05.160837435 +0000 UTC m=+2680.065203515" watchObservedRunningTime="2025-10-13 18:59:05.172128373 +0000 UTC m=+2680.076494463"
Oct 13 18:59:09 crc kubenswrapper[4974]: I1013 18:59:09.812606 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1"
Oct 13 18:59:09 crc kubenswrapper[4974]: E1013 18:59:09.813431 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1"
Oct 13 18:59:22 crc kubenswrapper[4974]: I1013 18:59:22.811858 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1"
Oct 13 18:59:22 crc kubenswrapper[4974]: E1013 18:59:22.812731 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1"
Oct 13 18:59:36 crc kubenswrapper[4974]: I1013 18:59:36.812722 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1"
Oct 13 18:59:36 crc kubenswrapper[4974]: E1013 18:59:36.813878 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1"
Oct 13 18:59:51 crc kubenswrapper[4974]: I1013 18:59:51.813037 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1"
Oct 13 18:59:52 crc kubenswrapper[4974]: I1013 18:59:52.745410 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"47cb04159faadae1b5ae1bef9861f9e72b533c227670e91c75a48fa646414e60"}
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.164441 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"]
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.167434 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.172247 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.172434 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.181995 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"]
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.275230 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/881aeb7e-48b8-49e3-8b72-3fc27951a12d-config-volume\") pod \"collect-profiles-29339700-vbj88\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.275493 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/881aeb7e-48b8-49e3-8b72-3fc27951a12d-secret-volume\") pod \"collect-profiles-29339700-vbj88\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.275577 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvlkf\" (UniqueName: \"kubernetes.io/projected/881aeb7e-48b8-49e3-8b72-3fc27951a12d-kube-api-access-hvlkf\") pod \"collect-profiles-29339700-vbj88\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.376997 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/881aeb7e-48b8-49e3-8b72-3fc27951a12d-config-volume\") pod \"collect-profiles-29339700-vbj88\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.377096 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/881aeb7e-48b8-49e3-8b72-3fc27951a12d-secret-volume\") pod \"collect-profiles-29339700-vbj88\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.377133 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvlkf\" (UniqueName: \"kubernetes.io/projected/881aeb7e-48b8-49e3-8b72-3fc27951a12d-kube-api-access-hvlkf\") pod \"collect-profiles-29339700-vbj88\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.378225 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/881aeb7e-48b8-49e3-8b72-3fc27951a12d-config-volume\") pod \"collect-profiles-29339700-vbj88\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.384621 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/881aeb7e-48b8-49e3-8b72-3fc27951a12d-secret-volume\") pod \"collect-profiles-29339700-vbj88\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.400191 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvlkf\" (UniqueName: \"kubernetes.io/projected/881aeb7e-48b8-49e3-8b72-3fc27951a12d-kube-api-access-hvlkf\") pod \"collect-profiles-29339700-vbj88\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.503060 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:00 crc kubenswrapper[4974]: I1013 19:00:00.965217 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"]
Oct 13 19:00:01 crc kubenswrapper[4974]: I1013 19:00:01.845768 4974 generic.go:334] "Generic (PLEG): container finished" podID="881aeb7e-48b8-49e3-8b72-3fc27951a12d" containerID="cf3ec6edb505ad49f02a0767ca960dbdf952cdd9a64be55d447881ebcef746f4" exitCode=0
Oct 13 19:00:01 crc kubenswrapper[4974]: I1013 19:00:01.845836 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88" event={"ID":"881aeb7e-48b8-49e3-8b72-3fc27951a12d","Type":"ContainerDied","Data":"cf3ec6edb505ad49f02a0767ca960dbdf952cdd9a64be55d447881ebcef746f4"}
Oct 13 19:00:01 crc kubenswrapper[4974]: I1013 19:00:01.846119 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88" event={"ID":"881aeb7e-48b8-49e3-8b72-3fc27951a12d","Type":"ContainerStarted","Data":"d9bc96b43e0606a5b10f06b05c8f6e11919778669be18b83e7caaa3aa9ea9bec"}
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.220421 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.328319 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvlkf\" (UniqueName: \"kubernetes.io/projected/881aeb7e-48b8-49e3-8b72-3fc27951a12d-kube-api-access-hvlkf\") pod \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") "
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.328578 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/881aeb7e-48b8-49e3-8b72-3fc27951a12d-config-volume\") pod \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") "
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.328671 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/881aeb7e-48b8-49e3-8b72-3fc27951a12d-secret-volume\") pod \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\" (UID: \"881aeb7e-48b8-49e3-8b72-3fc27951a12d\") "
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.329761 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/881aeb7e-48b8-49e3-8b72-3fc27951a12d-config-volume" (OuterVolumeSpecName: "config-volume") pod "881aeb7e-48b8-49e3-8b72-3fc27951a12d" (UID: "881aeb7e-48b8-49e3-8b72-3fc27951a12d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.334881 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/881aeb7e-48b8-49e3-8b72-3fc27951a12d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "881aeb7e-48b8-49e3-8b72-3fc27951a12d" (UID: "881aeb7e-48b8-49e3-8b72-3fc27951a12d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.336618 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/881aeb7e-48b8-49e3-8b72-3fc27951a12d-kube-api-access-hvlkf" (OuterVolumeSpecName: "kube-api-access-hvlkf") pod "881aeb7e-48b8-49e3-8b72-3fc27951a12d" (UID: "881aeb7e-48b8-49e3-8b72-3fc27951a12d"). InnerVolumeSpecName "kube-api-access-hvlkf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.431094 4974 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/881aeb7e-48b8-49e3-8b72-3fc27951a12d-config-volume\") on node \"crc\" DevicePath \"\""
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.431127 4974 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/881aeb7e-48b8-49e3-8b72-3fc27951a12d-secret-volume\") on node \"crc\" DevicePath \"\""
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.431137 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvlkf\" (UniqueName: \"kubernetes.io/projected/881aeb7e-48b8-49e3-8b72-3fc27951a12d-kube-api-access-hvlkf\") on node \"crc\" DevicePath \"\""
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.867908 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88" event={"ID":"881aeb7e-48b8-49e3-8b72-3fc27951a12d","Type":"ContainerDied","Data":"d9bc96b43e0606a5b10f06b05c8f6e11919778669be18b83e7caaa3aa9ea9bec"}
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.867947 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9bc96b43e0606a5b10f06b05c8f6e11919778669be18b83e7caaa3aa9ea9bec"
Oct 13 19:00:03 crc kubenswrapper[4974]: I1013 19:00:03.867996 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"
Oct 13 19:00:04 crc kubenswrapper[4974]: I1013 19:00:04.291977 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh"]
Oct 13 19:00:04 crc kubenswrapper[4974]: I1013 19:00:04.299047 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339655-svflh"]
Oct 13 19:00:05 crc kubenswrapper[4974]: I1013 19:00:05.840849 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7429c3e4-2ad8-4373-807a-b69a11868c49" path="/var/lib/kubelet/pods/7429c3e4-2ad8-4373-807a-b69a11868c49/volumes"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.661392 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ppzfn"]
Oct 13 19:00:38 crc kubenswrapper[4974]: E1013 19:00:38.662368 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="881aeb7e-48b8-49e3-8b72-3fc27951a12d" containerName="collect-profiles"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.662381 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="881aeb7e-48b8-49e3-8b72-3fc27951a12d" containerName="collect-profiles"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.662560 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="881aeb7e-48b8-49e3-8b72-3fc27951a12d" containerName="collect-profiles"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.663948 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.718854 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml5c8\" (UniqueName: \"kubernetes.io/projected/6f600a0a-5d64-464e-94cf-299ebe651364-kube-api-access-ml5c8\") pod \"certified-operators-ppzfn\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.718982 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-catalog-content\") pod \"certified-operators-ppzfn\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.719009 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-utilities\") pod \"certified-operators-ppzfn\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.738279 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ppzfn"]
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.820089 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml5c8\" (UniqueName: \"kubernetes.io/projected/6f600a0a-5d64-464e-94cf-299ebe651364-kube-api-access-ml5c8\") pod \"certified-operators-ppzfn\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.820389 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-catalog-content\") pod \"certified-operators-ppzfn\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.820480 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-utilities\") pod \"certified-operators-ppzfn\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.820986 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-utilities\") pod \"certified-operators-ppzfn\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.822081 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-catalog-content\") pod \"certified-operators-ppzfn\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:38 crc kubenswrapper[4974]: I1013 19:00:38.841319 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml5c8\" (UniqueName: \"kubernetes.io/projected/6f600a0a-5d64-464e-94cf-299ebe651364-kube-api-access-ml5c8\") pod \"certified-operators-ppzfn\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:39 crc kubenswrapper[4974]: I1013 19:00:39.052713 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ppzfn"
Oct 13 19:00:39 crc kubenswrapper[4974]: I1013 19:00:39.610369 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ppzfn"]
Oct 13 19:00:40 crc kubenswrapper[4974]: I1013 19:00:40.327349 4974 generic.go:334] "Generic (PLEG): container finished" podID="6f600a0a-5d64-464e-94cf-299ebe651364" containerID="5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1" exitCode=0
Oct 13 19:00:40 crc kubenswrapper[4974]: I1013 19:00:40.327469 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppzfn" event={"ID":"6f600a0a-5d64-464e-94cf-299ebe651364","Type":"ContainerDied","Data":"5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1"}
Oct 13 19:00:40 crc kubenswrapper[4974]: I1013 19:00:40.328043 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppzfn" event={"ID":"6f600a0a-5d64-464e-94cf-299ebe651364","Type":"ContainerStarted","Data":"3cfa4e94589cbe536df70d5fa6965f43af346b97cb25ecf21259da49b7609d62"}
Oct 13 19:00:40 crc kubenswrapper[4974]: I1013 19:00:40.333395 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 13 19:00:42 crc kubenswrapper[4974]: I1013 19:00:42.354537 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppzfn" event={"ID":"6f600a0a-5d64-464e-94cf-299ebe651364","Type":"ContainerStarted","Data":"6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740"}
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.369403 4974 generic.go:334] "Generic (PLEG): container finished" podID="6f600a0a-5d64-464e-94cf-299ebe651364" containerID="6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740" exitCode=0
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.369515 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppzfn" event={"ID":"6f600a0a-5d64-464e-94cf-299ebe651364","Type":"ContainerDied","Data":"6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740"}
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.637029 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5f65l"]
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.638950 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.652596 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5f65l"]
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.726642 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-utilities\") pod \"redhat-operators-5f65l\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.726738 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6mjv\" (UniqueName: \"kubernetes.io/projected/c9d24090-2cc9-40be-b14b-829288acfee7-kube-api-access-c6mjv\") pod \"redhat-operators-5f65l\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.726838 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-catalog-content\") pod \"redhat-operators-5f65l\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.828440 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-catalog-content\") pod \"redhat-operators-5f65l\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.828587 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-utilities\") pod \"redhat-operators-5f65l\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.828639 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6mjv\" (UniqueName: \"kubernetes.io/projected/c9d24090-2cc9-40be-b14b-829288acfee7-kube-api-access-c6mjv\") pod \"redhat-operators-5f65l\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.828920 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-catalog-content\") pod \"redhat-operators-5f65l\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.829266 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-utilities\") pod \"redhat-operators-5f65l\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.847917 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6mjv\" (UniqueName: \"kubernetes.io/projected/c9d24090-2cc9-40be-b14b-829288acfee7-kube-api-access-c6mjv\") pod \"redhat-operators-5f65l\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:43 crc kubenswrapper[4974]: I1013 19:00:43.959218 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f65l"
Oct 13 19:00:44 crc kubenswrapper[4974]: I1013 19:00:44.380848 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppzfn" event={"ID":"6f600a0a-5d64-464e-94cf-299ebe651364","Type":"ContainerStarted","Data":"91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d"}
Oct 13 19:00:44 crc kubenswrapper[4974]: I1013 19:00:44.405761 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ppzfn" podStartSLOduration=2.96214467 podStartE2EDuration="6.405741162s" podCreationTimestamp="2025-10-13 19:00:38 +0000 UTC" firstStartedPulling="2025-10-13 19:00:40.333137339 +0000 UTC m=+2775.237503429" lastFinishedPulling="2025-10-13 19:00:43.776733841 +0000 UTC m=+2778.681099921" observedRunningTime="2025-10-13 19:00:44.399225159 +0000 UTC m=+2779.303591239" watchObservedRunningTime="2025-10-13 19:00:44.405741162 +0000 UTC m=+2779.310107242"
Oct 13 19:00:44 crc kubenswrapper[4974]: W1013 19:00:44.447966 4974 manager.go:1169] Failed to process watch event {EventType:0
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9d24090_2cc9_40be_b14b_829288acfee7.slice/crio-5c882f9479e0550d5967494acb3ca0a660b5c12c8c7f1dfd51ca06e0fd89bc01 WatchSource:0}: Error finding container 5c882f9479e0550d5967494acb3ca0a660b5c12c8c7f1dfd51ca06e0fd89bc01: Status 404 returned error can't find the container with id 5c882f9479e0550d5967494acb3ca0a660b5c12c8c7f1dfd51ca06e0fd89bc01 Oct 13 19:00:44 crc kubenswrapper[4974]: I1013 19:00:44.448067 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5f65l"] Oct 13 19:00:45 crc kubenswrapper[4974]: I1013 19:00:45.391638 4974 generic.go:334] "Generic (PLEG): container finished" podID="c9d24090-2cc9-40be-b14b-829288acfee7" containerID="8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb" exitCode=0 Oct 13 19:00:45 crc kubenswrapper[4974]: I1013 19:00:45.392840 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f65l" event={"ID":"c9d24090-2cc9-40be-b14b-829288acfee7","Type":"ContainerDied","Data":"8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb"} Oct 13 19:00:45 crc kubenswrapper[4974]: I1013 19:00:45.392885 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f65l" event={"ID":"c9d24090-2cc9-40be-b14b-829288acfee7","Type":"ContainerStarted","Data":"5c882f9479e0550d5967494acb3ca0a660b5c12c8c7f1dfd51ca06e0fd89bc01"} Oct 13 19:00:47 crc kubenswrapper[4974]: I1013 19:00:47.419096 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f65l" event={"ID":"c9d24090-2cc9-40be-b14b-829288acfee7","Type":"ContainerStarted","Data":"db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77"} Oct 13 19:00:49 crc kubenswrapper[4974]: I1013 19:00:49.053080 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ppzfn" Oct 13 19:00:49 
crc kubenswrapper[4974]: I1013 19:00:49.053448 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ppzfn" Oct 13 19:00:49 crc kubenswrapper[4974]: I1013 19:00:49.128018 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ppzfn" Oct 13 19:00:49 crc kubenswrapper[4974]: I1013 19:00:49.500851 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ppzfn" Oct 13 19:00:49 crc kubenswrapper[4974]: I1013 19:00:49.933369 4974 scope.go:117] "RemoveContainer" containerID="88a68b5de9dd2fe749342e2a2bfa1c3733354311f4ea9f85bd3bcff1fa8c451e" Oct 13 19:00:50 crc kubenswrapper[4974]: I1013 19:00:50.454242 4974 generic.go:334] "Generic (PLEG): container finished" podID="c9d24090-2cc9-40be-b14b-829288acfee7" containerID="db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77" exitCode=0 Oct 13 19:00:50 crc kubenswrapper[4974]: I1013 19:00:50.454291 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f65l" event={"ID":"c9d24090-2cc9-40be-b14b-829288acfee7","Type":"ContainerDied","Data":"db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77"} Oct 13 19:00:51 crc kubenswrapper[4974]: I1013 19:00:51.468048 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f65l" event={"ID":"c9d24090-2cc9-40be-b14b-829288acfee7","Type":"ContainerStarted","Data":"6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b"} Oct 13 19:00:51 crc kubenswrapper[4974]: I1013 19:00:51.493020 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5f65l" podStartSLOduration=2.911268592 podStartE2EDuration="8.493002866s" podCreationTimestamp="2025-10-13 19:00:43 +0000 UTC" firstStartedPulling="2025-10-13 19:00:45.394976258 +0000 
UTC m=+2780.299342338" lastFinishedPulling="2025-10-13 19:00:50.976710502 +0000 UTC m=+2785.881076612" observedRunningTime="2025-10-13 19:00:51.484404533 +0000 UTC m=+2786.388770623" watchObservedRunningTime="2025-10-13 19:00:51.493002866 +0000 UTC m=+2786.397368946" Oct 13 19:00:51 crc kubenswrapper[4974]: I1013 19:00:51.841426 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ppzfn"] Oct 13 19:00:51 crc kubenswrapper[4974]: I1013 19:00:51.841710 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ppzfn" podUID="6f600a0a-5d64-464e-94cf-299ebe651364" containerName="registry-server" containerID="cri-o://91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d" gracePeriod=2 Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.309256 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ppzfn" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.407307 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-catalog-content\") pod \"6f600a0a-5d64-464e-94cf-299ebe651364\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.407480 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-utilities\") pod \"6f600a0a-5d64-464e-94cf-299ebe651364\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.407565 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml5c8\" (UniqueName: \"kubernetes.io/projected/6f600a0a-5d64-464e-94cf-299ebe651364-kube-api-access-ml5c8\") pod 
\"6f600a0a-5d64-464e-94cf-299ebe651364\" (UID: \"6f600a0a-5d64-464e-94cf-299ebe651364\") " Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.409609 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-utilities" (OuterVolumeSpecName: "utilities") pod "6f600a0a-5d64-464e-94cf-299ebe651364" (UID: "6f600a0a-5d64-464e-94cf-299ebe651364"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.413035 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f600a0a-5d64-464e-94cf-299ebe651364-kube-api-access-ml5c8" (OuterVolumeSpecName: "kube-api-access-ml5c8") pod "6f600a0a-5d64-464e-94cf-299ebe651364" (UID: "6f600a0a-5d64-464e-94cf-299ebe651364"). InnerVolumeSpecName "kube-api-access-ml5c8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.461048 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f600a0a-5d64-464e-94cf-299ebe651364" (UID: "6f600a0a-5d64-464e-94cf-299ebe651364"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.482850 4974 generic.go:334] "Generic (PLEG): container finished" podID="6f600a0a-5d64-464e-94cf-299ebe651364" containerID="91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d" exitCode=0 Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.482887 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppzfn" event={"ID":"6f600a0a-5d64-464e-94cf-299ebe651364","Type":"ContainerDied","Data":"91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d"} Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.482913 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ppzfn" event={"ID":"6f600a0a-5d64-464e-94cf-299ebe651364","Type":"ContainerDied","Data":"3cfa4e94589cbe536df70d5fa6965f43af346b97cb25ecf21259da49b7609d62"} Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.482933 4974 scope.go:117] "RemoveContainer" containerID="91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.483041 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ppzfn" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.522454 4974 scope.go:117] "RemoveContainer" containerID="6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.523946 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.523973 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml5c8\" (UniqueName: \"kubernetes.io/projected/6f600a0a-5d64-464e-94cf-299ebe651364-kube-api-access-ml5c8\") on node \"crc\" DevicePath \"\"" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.523987 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f600a0a-5d64-464e-94cf-299ebe651364-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.531042 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ppzfn"] Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.550097 4974 scope.go:117] "RemoveContainer" containerID="5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.561911 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ppzfn"] Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.595764 4974 scope.go:117] "RemoveContainer" containerID="91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d" Oct 13 19:00:52 crc kubenswrapper[4974]: E1013 19:00:52.596219 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d\": container with ID starting with 91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d not found: ID does not exist" containerID="91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.596262 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d"} err="failed to get container status \"91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d\": rpc error: code = NotFound desc = could not find container \"91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d\": container with ID starting with 91e76e9f0797f510b9caf67b736f93c469a6872a4e1c85a3b04f2d597a4fed0d not found: ID does not exist" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.596292 4974 scope.go:117] "RemoveContainer" containerID="6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740" Oct 13 19:00:52 crc kubenswrapper[4974]: E1013 19:00:52.596863 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740\": container with ID starting with 6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740 not found: ID does not exist" containerID="6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.596891 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740"} err="failed to get container status \"6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740\": rpc error: code = NotFound desc = could not find container \"6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740\": container with ID 
starting with 6ba0e606910c65f437d751de7f4e59f56ce61e644c0cefa2faa5de91a8715740 not found: ID does not exist" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.596911 4974 scope.go:117] "RemoveContainer" containerID="5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1" Oct 13 19:00:52 crc kubenswrapper[4974]: E1013 19:00:52.597249 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1\": container with ID starting with 5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1 not found: ID does not exist" containerID="5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1" Oct 13 19:00:52 crc kubenswrapper[4974]: I1013 19:00:52.597333 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1"} err="failed to get container status \"5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1\": rpc error: code = NotFound desc = could not find container \"5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1\": container with ID starting with 5a747ad674834a66b18cf35fbf585d20d59329b322da779d7c0592c804be8ab1 not found: ID does not exist" Oct 13 19:00:53 crc kubenswrapper[4974]: I1013 19:00:53.822813 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f600a0a-5d64-464e-94cf-299ebe651364" path="/var/lib/kubelet/pods/6f600a0a-5d64-464e-94cf-299ebe651364/volumes" Oct 13 19:00:53 crc kubenswrapper[4974]: I1013 19:00:53.959924 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5f65l" Oct 13 19:00:53 crc kubenswrapper[4974]: I1013 19:00:53.960196 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5f65l" Oct 13 19:00:55 crc 
kubenswrapper[4974]: I1013 19:00:55.043701 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5f65l" podUID="c9d24090-2cc9-40be-b14b-829288acfee7" containerName="registry-server" probeResult="failure" output=< Oct 13 19:00:55 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 19:00:55 crc kubenswrapper[4974]: > Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.170792 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29339701-wr52s"] Oct 13 19:01:00 crc kubenswrapper[4974]: E1013 19:01:00.172022 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f600a0a-5d64-464e-94cf-299ebe651364" containerName="extract-utilities" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.172041 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f600a0a-5d64-464e-94cf-299ebe651364" containerName="extract-utilities" Oct 13 19:01:00 crc kubenswrapper[4974]: E1013 19:01:00.172067 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f600a0a-5d64-464e-94cf-299ebe651364" containerName="extract-content" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.172076 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f600a0a-5d64-464e-94cf-299ebe651364" containerName="extract-content" Oct 13 19:01:00 crc kubenswrapper[4974]: E1013 19:01:00.172123 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f600a0a-5d64-464e-94cf-299ebe651364" containerName="registry-server" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.172134 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f600a0a-5d64-464e-94cf-299ebe651364" containerName="registry-server" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.172419 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f600a0a-5d64-464e-94cf-299ebe651364" containerName="registry-server" Oct 13 19:01:00 crc kubenswrapper[4974]: 
I1013 19:01:00.173753 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.183853 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29339701-wr52s"] Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.301548 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfcjq\" (UniqueName: \"kubernetes.io/projected/d52747d1-422c-40d4-ae78-d45dafcf9cbf-kube-api-access-sfcjq\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.301641 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-fernet-keys\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.301710 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-combined-ca-bundle\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.301751 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-config-data\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 
19:01:00.402910 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-fernet-keys\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.403268 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-combined-ca-bundle\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.403323 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-config-data\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.403424 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfcjq\" (UniqueName: \"kubernetes.io/projected/d52747d1-422c-40d4-ae78-d45dafcf9cbf-kube-api-access-sfcjq\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.408931 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-fernet-keys\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.409459 4974 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-config-data\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.410543 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-combined-ca-bundle\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.421465 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfcjq\" (UniqueName: \"kubernetes.io/projected/d52747d1-422c-40d4-ae78-d45dafcf9cbf-kube-api-access-sfcjq\") pod \"keystone-cron-29339701-wr52s\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:00 crc kubenswrapper[4974]: I1013 19:01:00.531486 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:01 crc kubenswrapper[4974]: I1013 19:01:01.018493 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29339701-wr52s"] Oct 13 19:01:01 crc kubenswrapper[4974]: W1013 19:01:01.028956 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd52747d1_422c_40d4_ae78_d45dafcf9cbf.slice/crio-302520205f71817fd5efdbfbd532544d0237c69e3da4a0bfafe156af7a5b2496 WatchSource:0}: Error finding container 302520205f71817fd5efdbfbd532544d0237c69e3da4a0bfafe156af7a5b2496: Status 404 returned error can't find the container with id 302520205f71817fd5efdbfbd532544d0237c69e3da4a0bfafe156af7a5b2496 Oct 13 19:01:01 crc kubenswrapper[4974]: I1013 19:01:01.570824 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339701-wr52s" event={"ID":"d52747d1-422c-40d4-ae78-d45dafcf9cbf","Type":"ContainerStarted","Data":"9495c25cc46fbae2004466f689fb408b3dc8328772d0a4ad9ef80749c52addb4"} Oct 13 19:01:01 crc kubenswrapper[4974]: I1013 19:01:01.570886 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339701-wr52s" event={"ID":"d52747d1-422c-40d4-ae78-d45dafcf9cbf","Type":"ContainerStarted","Data":"302520205f71817fd5efdbfbd532544d0237c69e3da4a0bfafe156af7a5b2496"} Oct 13 19:01:01 crc kubenswrapper[4974]: I1013 19:01:01.589618 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29339701-wr52s" podStartSLOduration=1.589588729 podStartE2EDuration="1.589588729s" podCreationTimestamp="2025-10-13 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 19:01:01.587097779 +0000 UTC m=+2796.491463879" watchObservedRunningTime="2025-10-13 19:01:01.589588729 +0000 UTC m=+2796.493954849" Oct 13 19:01:04 crc 
kubenswrapper[4974]: I1013 19:01:04.033554 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5f65l" Oct 13 19:01:04 crc kubenswrapper[4974]: I1013 19:01:04.097110 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5f65l" Oct 13 19:01:04 crc kubenswrapper[4974]: I1013 19:01:04.278508 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5f65l"] Oct 13 19:01:04 crc kubenswrapper[4974]: I1013 19:01:04.627952 4974 generic.go:334] "Generic (PLEG): container finished" podID="d52747d1-422c-40d4-ae78-d45dafcf9cbf" containerID="9495c25cc46fbae2004466f689fb408b3dc8328772d0a4ad9ef80749c52addb4" exitCode=0 Oct 13 19:01:04 crc kubenswrapper[4974]: I1013 19:01:04.628009 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339701-wr52s" event={"ID":"d52747d1-422c-40d4-ae78-d45dafcf9cbf","Type":"ContainerDied","Data":"9495c25cc46fbae2004466f689fb408b3dc8328772d0a4ad9ef80749c52addb4"} Oct 13 19:01:05 crc kubenswrapper[4974]: I1013 19:01:05.641594 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5f65l" podUID="c9d24090-2cc9-40be-b14b-829288acfee7" containerName="registry-server" containerID="cri-o://6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b" gracePeriod=2 Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.025382 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.133508 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-fernet-keys\") pod \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.133566 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfcjq\" (UniqueName: \"kubernetes.io/projected/d52747d1-422c-40d4-ae78-d45dafcf9cbf-kube-api-access-sfcjq\") pod \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.133595 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-combined-ca-bundle\") pod \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.133687 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-config-data\") pod \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\" (UID: \"d52747d1-422c-40d4-ae78-d45dafcf9cbf\") " Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.139217 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "d52747d1-422c-40d4-ae78-d45dafcf9cbf" (UID: "d52747d1-422c-40d4-ae78-d45dafcf9cbf"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.140802 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d52747d1-422c-40d4-ae78-d45dafcf9cbf-kube-api-access-sfcjq" (OuterVolumeSpecName: "kube-api-access-sfcjq") pod "d52747d1-422c-40d4-ae78-d45dafcf9cbf" (UID: "d52747d1-422c-40d4-ae78-d45dafcf9cbf"). InnerVolumeSpecName "kube-api-access-sfcjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.153072 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f65l" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.163748 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d52747d1-422c-40d4-ae78-d45dafcf9cbf" (UID: "d52747d1-422c-40d4-ae78-d45dafcf9cbf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.215041 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-config-data" (OuterVolumeSpecName: "config-data") pod "d52747d1-422c-40d4-ae78-d45dafcf9cbf" (UID: "d52747d1-422c-40d4-ae78-d45dafcf9cbf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.236476 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-utilities\") pod \"c9d24090-2cc9-40be-b14b-829288acfee7\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.236561 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-catalog-content\") pod \"c9d24090-2cc9-40be-b14b-829288acfee7\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.236671 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6mjv\" (UniqueName: \"kubernetes.io/projected/c9d24090-2cc9-40be-b14b-829288acfee7-kube-api-access-c6mjv\") pod \"c9d24090-2cc9-40be-b14b-829288acfee7\" (UID: \"c9d24090-2cc9-40be-b14b-829288acfee7\") " Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.237218 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-utilities" (OuterVolumeSpecName: "utilities") pod "c9d24090-2cc9-40be-b14b-829288acfee7" (UID: "c9d24090-2cc9-40be-b14b-829288acfee7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.237783 4974 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.237802 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfcjq\" (UniqueName: \"kubernetes.io/projected/d52747d1-422c-40d4-ae78-d45dafcf9cbf-kube-api-access-sfcjq\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.237817 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.237826 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.237833 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d52747d1-422c-40d4-ae78-d45dafcf9cbf-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.240323 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d24090-2cc9-40be-b14b-829288acfee7-kube-api-access-c6mjv" (OuterVolumeSpecName: "kube-api-access-c6mjv") pod "c9d24090-2cc9-40be-b14b-829288acfee7" (UID: "c9d24090-2cc9-40be-b14b-829288acfee7"). InnerVolumeSpecName "kube-api-access-c6mjv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.322452 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9d24090-2cc9-40be-b14b-829288acfee7" (UID: "c9d24090-2cc9-40be-b14b-829288acfee7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.340398 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9d24090-2cc9-40be-b14b-829288acfee7-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.340458 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6mjv\" (UniqueName: \"kubernetes.io/projected/c9d24090-2cc9-40be-b14b-829288acfee7-kube-api-access-c6mjv\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.654516 4974 generic.go:334] "Generic (PLEG): container finished" podID="c9d24090-2cc9-40be-b14b-829288acfee7" containerID="6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b" exitCode=0 Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.654555 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f65l" event={"ID":"c9d24090-2cc9-40be-b14b-829288acfee7","Type":"ContainerDied","Data":"6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b"} Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.654609 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5f65l" event={"ID":"c9d24090-2cc9-40be-b14b-829288acfee7","Type":"ContainerDied","Data":"5c882f9479e0550d5967494acb3ca0a660b5c12c8c7f1dfd51ca06e0fd89bc01"} Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 
19:01:06.654627 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5f65l" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.654642 4974 scope.go:117] "RemoveContainer" containerID="6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.656897 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339701-wr52s" event={"ID":"d52747d1-422c-40d4-ae78-d45dafcf9cbf","Type":"ContainerDied","Data":"302520205f71817fd5efdbfbd532544d0237c69e3da4a0bfafe156af7a5b2496"} Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.656934 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="302520205f71817fd5efdbfbd532544d0237c69e3da4a0bfafe156af7a5b2496" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.657114 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29339701-wr52s" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.700193 4974 scope.go:117] "RemoveContainer" containerID="db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.727401 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5f65l"] Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.737346 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5f65l"] Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.751071 4974 scope.go:117] "RemoveContainer" containerID="8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.782830 4974 scope.go:117] "RemoveContainer" containerID="6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b" Oct 13 19:01:06 crc kubenswrapper[4974]: E1013 19:01:06.783572 4974 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b\": container with ID starting with 6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b not found: ID does not exist" containerID="6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.783611 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b"} err="failed to get container status \"6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b\": rpc error: code = NotFound desc = could not find container \"6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b\": container with ID starting with 6c2a5493498a81c3841d4717b344a6d2d7036f61692fbd21f6f368a45232f78b not found: ID does not exist" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.783640 4974 scope.go:117] "RemoveContainer" containerID="db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77" Oct 13 19:01:06 crc kubenswrapper[4974]: E1013 19:01:06.784071 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77\": container with ID starting with db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77 not found: ID does not exist" containerID="db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.784132 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77"} err="failed to get container status \"db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77\": rpc error: code = NotFound 
desc = could not find container \"db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77\": container with ID starting with db82bd5883513177eeafc9a243872cd02746787791965c7ae0cc11225e2b2c77 not found: ID does not exist" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.784167 4974 scope.go:117] "RemoveContainer" containerID="8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb" Oct 13 19:01:06 crc kubenswrapper[4974]: E1013 19:01:06.784629 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb\": container with ID starting with 8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb not found: ID does not exist" containerID="8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb" Oct 13 19:01:06 crc kubenswrapper[4974]: I1013 19:01:06.784714 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb"} err="failed to get container status \"8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb\": rpc error: code = NotFound desc = could not find container \"8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb\": container with ID starting with 8adbfa5e8c0ab0247a6c4619a21e9b24da0e5b899ba199e101880cf57aada9bb not found: ID does not exist" Oct 13 19:01:07 crc kubenswrapper[4974]: I1013 19:01:07.825728 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d24090-2cc9-40be-b14b-829288acfee7" path="/var/lib/kubelet/pods/c9d24090-2cc9-40be-b14b-829288acfee7/volumes" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.213712 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pzpp2"] Oct 13 19:01:21 crc kubenswrapper[4974]: E1013 19:01:21.214596 4974 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="c9d24090-2cc9-40be-b14b-829288acfee7" containerName="registry-server" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.214609 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d24090-2cc9-40be-b14b-829288acfee7" containerName="registry-server" Oct 13 19:01:21 crc kubenswrapper[4974]: E1013 19:01:21.214626 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d24090-2cc9-40be-b14b-829288acfee7" containerName="extract-utilities" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.214632 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d24090-2cc9-40be-b14b-829288acfee7" containerName="extract-utilities" Oct 13 19:01:21 crc kubenswrapper[4974]: E1013 19:01:21.214688 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d24090-2cc9-40be-b14b-829288acfee7" containerName="extract-content" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.214700 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d24090-2cc9-40be-b14b-829288acfee7" containerName="extract-content" Oct 13 19:01:21 crc kubenswrapper[4974]: E1013 19:01:21.214724 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d52747d1-422c-40d4-ae78-d45dafcf9cbf" containerName="keystone-cron" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.214730 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="d52747d1-422c-40d4-ae78-d45dafcf9cbf" containerName="keystone-cron" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.214935 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="d52747d1-422c-40d4-ae78-d45dafcf9cbf" containerName="keystone-cron" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.214965 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d24090-2cc9-40be-b14b-829288acfee7" containerName="registry-server" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.216366 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.222423 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzpp2"] Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.371315 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-catalog-content\") pod \"community-operators-pzpp2\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.371641 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-utilities\") pod \"community-operators-pzpp2\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.371745 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74pd2\" (UniqueName: \"kubernetes.io/projected/0b7334a1-7602-4ca8-bf35-0a3da10539ff-kube-api-access-74pd2\") pod \"community-operators-pzpp2\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.474317 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-utilities\") pod \"community-operators-pzpp2\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.474970 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-74pd2\" (UniqueName: \"kubernetes.io/projected/0b7334a1-7602-4ca8-bf35-0a3da10539ff-kube-api-access-74pd2\") pod \"community-operators-pzpp2\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.475100 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-catalog-content\") pod \"community-operators-pzpp2\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.474883 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-utilities\") pod \"community-operators-pzpp2\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.475916 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-catalog-content\") pod \"community-operators-pzpp2\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.493508 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74pd2\" (UniqueName: \"kubernetes.io/projected/0b7334a1-7602-4ca8-bf35-0a3da10539ff-kube-api-access-74pd2\") pod \"community-operators-pzpp2\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:21 crc kubenswrapper[4974]: I1013 19:01:21.544300 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:22 crc kubenswrapper[4974]: I1013 19:01:22.007609 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pzpp2"] Oct 13 19:01:22 crc kubenswrapper[4974]: I1013 19:01:22.855942 4974 generic.go:334] "Generic (PLEG): container finished" podID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerID="52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22" exitCode=0 Oct 13 19:01:22 crc kubenswrapper[4974]: I1013 19:01:22.858105 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpp2" event={"ID":"0b7334a1-7602-4ca8-bf35-0a3da10539ff","Type":"ContainerDied","Data":"52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22"} Oct 13 19:01:22 crc kubenswrapper[4974]: I1013 19:01:22.858387 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpp2" event={"ID":"0b7334a1-7602-4ca8-bf35-0a3da10539ff","Type":"ContainerStarted","Data":"516ae5fba09c3129573d611f6da70fcdf5591101d365236369970827a9623517"} Oct 13 19:01:24 crc kubenswrapper[4974]: I1013 19:01:24.880142 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpp2" event={"ID":"0b7334a1-7602-4ca8-bf35-0a3da10539ff","Type":"ContainerStarted","Data":"f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc"} Oct 13 19:01:25 crc kubenswrapper[4974]: I1013 19:01:25.892183 4974 generic.go:334] "Generic (PLEG): container finished" podID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerID="f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc" exitCode=0 Oct 13 19:01:25 crc kubenswrapper[4974]: I1013 19:01:25.892237 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpp2" 
event={"ID":"0b7334a1-7602-4ca8-bf35-0a3da10539ff","Type":"ContainerDied","Data":"f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc"} Oct 13 19:01:26 crc kubenswrapper[4974]: I1013 19:01:26.911816 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpp2" event={"ID":"0b7334a1-7602-4ca8-bf35-0a3da10539ff","Type":"ContainerStarted","Data":"8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4"} Oct 13 19:01:26 crc kubenswrapper[4974]: I1013 19:01:26.942568 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pzpp2" podStartSLOduration=2.436013859 podStartE2EDuration="5.942530984s" podCreationTimestamp="2025-10-13 19:01:21 +0000 UTC" firstStartedPulling="2025-10-13 19:01:22.863161291 +0000 UTC m=+2817.767527381" lastFinishedPulling="2025-10-13 19:01:26.369678406 +0000 UTC m=+2821.274044506" observedRunningTime="2025-10-13 19:01:26.938430709 +0000 UTC m=+2821.842796809" watchObservedRunningTime="2025-10-13 19:01:26.942530984 +0000 UTC m=+2821.846897104" Oct 13 19:01:31 crc kubenswrapper[4974]: I1013 19:01:31.544857 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:31 crc kubenswrapper[4974]: I1013 19:01:31.545782 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:31 crc kubenswrapper[4974]: I1013 19:01:31.626331 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:32 crc kubenswrapper[4974]: I1013 19:01:32.018723 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:32 crc kubenswrapper[4974]: I1013 19:01:32.074356 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-pzpp2"] Oct 13 19:01:33 crc kubenswrapper[4974]: I1013 19:01:33.994706 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pzpp2" podUID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerName="registry-server" containerID="cri-o://8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4" gracePeriod=2 Oct 13 19:01:34 crc kubenswrapper[4974]: I1013 19:01:34.465248 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:34 crc kubenswrapper[4974]: I1013 19:01:34.511546 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-catalog-content\") pod \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " Oct 13 19:01:34 crc kubenswrapper[4974]: I1013 19:01:34.566443 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0b7334a1-7602-4ca8-bf35-0a3da10539ff" (UID: "0b7334a1-7602-4ca8-bf35-0a3da10539ff"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:01:34 crc kubenswrapper[4974]: I1013 19:01:34.613148 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-utilities\") pod \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " Oct 13 19:01:34 crc kubenswrapper[4974]: I1013 19:01:34.613300 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-74pd2\" (UniqueName: \"kubernetes.io/projected/0b7334a1-7602-4ca8-bf35-0a3da10539ff-kube-api-access-74pd2\") pod \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\" (UID: \"0b7334a1-7602-4ca8-bf35-0a3da10539ff\") " Oct 13 19:01:34 crc kubenswrapper[4974]: I1013 19:01:34.613925 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:34 crc kubenswrapper[4974]: I1013 19:01:34.614028 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-utilities" (OuterVolumeSpecName: "utilities") pod "0b7334a1-7602-4ca8-bf35-0a3da10539ff" (UID: "0b7334a1-7602-4ca8-bf35-0a3da10539ff"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:01:34 crc kubenswrapper[4974]: I1013 19:01:34.627232 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b7334a1-7602-4ca8-bf35-0a3da10539ff-kube-api-access-74pd2" (OuterVolumeSpecName: "kube-api-access-74pd2") pod "0b7334a1-7602-4ca8-bf35-0a3da10539ff" (UID: "0b7334a1-7602-4ca8-bf35-0a3da10539ff"). InnerVolumeSpecName "kube-api-access-74pd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:01:34 crc kubenswrapper[4974]: I1013 19:01:34.715253 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-74pd2\" (UniqueName: \"kubernetes.io/projected/0b7334a1-7602-4ca8-bf35-0a3da10539ff-kube-api-access-74pd2\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:34 crc kubenswrapper[4974]: I1013 19:01:34.715639 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0b7334a1-7602-4ca8-bf35-0a3da10539ff-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.007131 4974 generic.go:334] "Generic (PLEG): container finished" podID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerID="8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4" exitCode=0 Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.007177 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpp2" event={"ID":"0b7334a1-7602-4ca8-bf35-0a3da10539ff","Type":"ContainerDied","Data":"8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4"} Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.007208 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pzpp2" event={"ID":"0b7334a1-7602-4ca8-bf35-0a3da10539ff","Type":"ContainerDied","Data":"516ae5fba09c3129573d611f6da70fcdf5591101d365236369970827a9623517"} Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.007225 4974 scope.go:117] "RemoveContainer" containerID="8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.007384 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pzpp2" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.033850 4974 scope.go:117] "RemoveContainer" containerID="f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.052390 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pzpp2"] Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.065816 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pzpp2"] Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.070142 4974 scope.go:117] "RemoveContainer" containerID="52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.142402 4974 scope.go:117] "RemoveContainer" containerID="8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4" Oct 13 19:01:35 crc kubenswrapper[4974]: E1013 19:01:35.142881 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4\": container with ID starting with 8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4 not found: ID does not exist" containerID="8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.142921 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4"} err="failed to get container status \"8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4\": rpc error: code = NotFound desc = could not find container \"8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4\": container with ID starting with 8008f8b4a33a3f585a7feb4df7a31fe9d97d91a931e8ac111e0a1bc79a2abea4 not 
found: ID does not exist" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.142947 4974 scope.go:117] "RemoveContainer" containerID="f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc" Oct 13 19:01:35 crc kubenswrapper[4974]: E1013 19:01:35.143477 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc\": container with ID starting with f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc not found: ID does not exist" containerID="f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.143521 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc"} err="failed to get container status \"f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc\": rpc error: code = NotFound desc = could not find container \"f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc\": container with ID starting with f6c4edb386d43e86dab1a6c5001102382dc18b48e0c8311345a7294a7602aacc not found: ID does not exist" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.143555 4974 scope.go:117] "RemoveContainer" containerID="52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22" Oct 13 19:01:35 crc kubenswrapper[4974]: E1013 19:01:35.144027 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22\": container with ID starting with 52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22 not found: ID does not exist" containerID="52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.144060 4974 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22"} err="failed to get container status \"52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22\": rpc error: code = NotFound desc = could not find container \"52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22\": container with ID starting with 52c311005e24cff8a50f1315c1f96df26711e7d21be8d5c064e0cf7677a47f22 not found: ID does not exist" Oct 13 19:01:35 crc kubenswrapper[4974]: I1013 19:01:35.827974 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" path="/var/lib/kubelet/pods/0b7334a1-7602-4ca8-bf35-0a3da10539ff/volumes" Oct 13 19:01:44 crc kubenswrapper[4974]: I1013 19:01:44.107101 4974 generic.go:334] "Generic (PLEG): container finished" podID="a947ab95-2720-4cda-a618-470943b7443c" containerID="a5c1ef7ccc98b39626619aac357fa5ff8fee5bf58cc461d6337a7d502ced2094" exitCode=0 Oct 13 19:01:44 crc kubenswrapper[4974]: I1013 19:01:44.107233 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr" event={"ID":"a947ab95-2720-4cda-a618-470943b7443c","Type":"ContainerDied","Data":"a5c1ef7ccc98b39626619aac357fa5ff8fee5bf58cc461d6337a7d502ced2094"} Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.664550 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.802368 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-inventory\") pod \"a947ab95-2720-4cda-a618-470943b7443c\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.803080 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdcmm\" (UniqueName: \"kubernetes.io/projected/a947ab95-2720-4cda-a618-470943b7443c-kube-api-access-bdcmm\") pod \"a947ab95-2720-4cda-a618-470943b7443c\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.803102 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-0\") pod \"a947ab95-2720-4cda-a618-470943b7443c\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.803225 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-1\") pod \"a947ab95-2720-4cda-a618-470943b7443c\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.803304 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ssh-key\") pod \"a947ab95-2720-4cda-a618-470943b7443c\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.803325 4974 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-2\") pod \"a947ab95-2720-4cda-a618-470943b7443c\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.803367 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-telemetry-combined-ca-bundle\") pod \"a947ab95-2720-4cda-a618-470943b7443c\" (UID: \"a947ab95-2720-4cda-a618-470943b7443c\") " Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.808562 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "a947ab95-2720-4cda-a618-470943b7443c" (UID: "a947ab95-2720-4cda-a618-470943b7443c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.809715 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a947ab95-2720-4cda-a618-470943b7443c-kube-api-access-bdcmm" (OuterVolumeSpecName: "kube-api-access-bdcmm") pod "a947ab95-2720-4cda-a618-470943b7443c" (UID: "a947ab95-2720-4cda-a618-470943b7443c"). InnerVolumeSpecName "kube-api-access-bdcmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.831873 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "a947ab95-2720-4cda-a618-470943b7443c" (UID: "a947ab95-2720-4cda-a618-470943b7443c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.833372 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "a947ab95-2720-4cda-a618-470943b7443c" (UID: "a947ab95-2720-4cda-a618-470943b7443c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.849474 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "a947ab95-2720-4cda-a618-470943b7443c" (UID: "a947ab95-2720-4cda-a618-470943b7443c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.851005 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "a947ab95-2720-4cda-a618-470943b7443c" (UID: "a947ab95-2720-4cda-a618-470943b7443c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.872891 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-inventory" (OuterVolumeSpecName: "inventory") pod "a947ab95-2720-4cda-a618-470943b7443c" (UID: "a947ab95-2720-4cda-a618-470943b7443c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.905258 4974 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.905279 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.905289 4974 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.905306 4974 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.905318 4974 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-inventory\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.905326 4974 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-bdcmm\" (UniqueName: \"kubernetes.io/projected/a947ab95-2720-4cda-a618-470943b7443c-kube-api-access-bdcmm\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:45 crc kubenswrapper[4974]: I1013 19:01:45.905333 4974 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/a947ab95-2720-4cda-a618-470943b7443c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Oct 13 19:01:46 crc kubenswrapper[4974]: I1013 19:01:46.164336 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr" event={"ID":"a947ab95-2720-4cda-a618-470943b7443c","Type":"ContainerDied","Data":"9b80ef27c737abc2b27c77b84408c0e984e0645f33e683ded28fb7fddae415fc"} Oct 13 19:01:46 crc kubenswrapper[4974]: I1013 19:01:46.164398 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b80ef27c737abc2b27c77b84408c0e984e0645f33e683ded28fb7fddae415fc" Oct 13 19:01:46 crc kubenswrapper[4974]: I1013 19:01:46.164496 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr" Oct 13 19:02:07 crc kubenswrapper[4974]: I1013 19:02:07.742863 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:02:07 crc kubenswrapper[4974]: I1013 19:02:07.743768 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.893036 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Oct 13 19:02:22 crc kubenswrapper[4974]: E1013 19:02:22.894142 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerName="extract-content" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.894160 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerName="extract-content" Oct 13 19:02:22 crc kubenswrapper[4974]: E1013 19:02:22.894178 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a947ab95-2720-4cda-a618-470943b7443c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.894188 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="a947ab95-2720-4cda-a618-470943b7443c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 13 19:02:22 crc kubenswrapper[4974]: E1013 19:02:22.894208 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerName="registry-server" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.894215 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerName="registry-server" Oct 13 19:02:22 crc kubenswrapper[4974]: E1013 19:02:22.894233 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerName="extract-utilities" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.894240 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerName="extract-utilities" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.894436 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b7334a1-7602-4ca8-bf35-0a3da10539ff" containerName="registry-server" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.894457 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="a947ab95-2720-4cda-a618-470943b7443c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.895737 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.906136 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.908268 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.928707 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.928757 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.928785 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-config-data\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.928833 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-lib-modules\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.928894 4974 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-run\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.928918 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.928938 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.928952 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.928969 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.928983 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-scripts\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.929028 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.929191 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-dev\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.929276 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w9rs\" (UniqueName: \"kubernetes.io/projected/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-kube-api-access-7w9rs\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.929373 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-sys\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.929417 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-etc-nvme\") pod \"cinder-backup-0\" (UID: 
\"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.992763 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-0"] Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.994834 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:22 crc kubenswrapper[4974]: I1013 19:02:22.996997 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-config-data" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.006597 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.032607 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.032718 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.032774 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-dev\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.032806 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" 
(UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-dev\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.032835 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w9rs\" (UniqueName: \"kubernetes.io/projected/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-kube-api-access-7w9rs\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.032883 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.032923 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-sys\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.032967 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.033022 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-etc-nvme\") pod \"cinder-backup-0\" (UID: 
\"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.033189 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.033238 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.033277 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.033328 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-config-data\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.034139 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.034195 4974 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-sys\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.034293 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-etc-nvme\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.034325 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-dev\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.033364 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-run\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.034858 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.034937 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " 
pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.034952 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-lib-modules\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035003 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035044 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qbcw\" (UniqueName: \"kubernetes.io/projected/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-kube-api-access-9qbcw\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035068 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-sys\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035144 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035169 
4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035224 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-run\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035247 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035274 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035319 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035345 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035369 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035388 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-scripts\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035433 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035468 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035711 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-lib-modules\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " 
pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035806 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-run\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.035964 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.036303 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.039164 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.059670 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-config-data-custom\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.059886 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-config-data\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.060766 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.064040 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w9rs\" (UniqueName: \"kubernetes.io/projected/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-kube-api-access-7w9rs\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.065091 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57963e15-5fff-4158-ad83-0e4bd2ca1f7f-scripts\") pod \"cinder-backup-0\" (UID: \"57963e15-5fff-4158-ad83-0e4bd2ca1f7f\") " pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.093380 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.095077 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.097820 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-nfs-2-config-data" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.122462 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137483 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137538 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137562 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137582 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-sys\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137602 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9qbcw\" (UniqueName: \"kubernetes.io/projected/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-kube-api-access-9qbcw\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137625 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137676 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137693 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137691 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-etc-nvme\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137725 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-config-data\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137823 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137876 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137910 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137937 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.137973 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.138047 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.138110 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.138137 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-dev\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.138210 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.138221 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-sys\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.138334 4974 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-var-locks-brick\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.138406 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-dev\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.138577 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-etc-iscsi\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.138836 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.138905 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-var-lib-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.139041 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.139188 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.139297 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.139392 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.139494 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.139614 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-run\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 
19:02:23.139741 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhtcl\" (UniqueName: \"kubernetes.io/projected/688ac6ef-82eb-421b-9949-a832b8a73319-kube-api-access-qhtcl\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.139843 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.139978 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.140092 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.140196 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.139078 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-var-locks-cinder\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.140542 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-etc-machine-id\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.140703 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-run\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.140844 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-lib-modules\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.142310 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-config-data-custom\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.142356 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-config-data\") pod \"cinder-volume-nfs-0\" 
(UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.143571 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-combined-ca-bundle\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.144875 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-scripts\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.154089 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qbcw\" (UniqueName: \"kubernetes.io/projected/90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e-kube-api-access-9qbcw\") pod \"cinder-volume-nfs-0\" (UID: \"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e\") " pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.224076 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242602 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242708 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242718 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-sys\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242755 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242789 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242815 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242844 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhtcl\" (UniqueName: \"kubernetes.io/projected/688ac6ef-82eb-421b-9949-a832b8a73319-kube-api-access-qhtcl\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242868 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242916 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242944 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.242987 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: 
\"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243015 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243046 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243124 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243163 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243197 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 
19:02:23.243295 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-dev\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243317 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-etc-machine-id\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243362 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-var-locks-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243391 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-var-locks-brick\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243422 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-var-lib-cinder\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243455 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-lib-modules\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243491 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-etc-iscsi\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243526 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-run\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.243674 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/688ac6ef-82eb-421b-9949-a832b8a73319-etc-nvme\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.247614 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-combined-ca-bundle\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.247943 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-config-data-custom\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" 
Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.248639 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-config-data\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.257185 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/688ac6ef-82eb-421b-9949-a832b8a73319-scripts\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.266213 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhtcl\" (UniqueName: \"kubernetes.io/projected/688ac6ef-82eb-421b-9949-a832b8a73319-kube-api-access-qhtcl\") pod \"cinder-volume-nfs-2-0\" (UID: \"688ac6ef-82eb-421b-9949-a832b8a73319\") " pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.315851 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.318944 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.869465 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Oct 13 19:02:23 crc kubenswrapper[4974]: I1013 19:02:23.982941 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-0"] Oct 13 19:02:24 crc kubenswrapper[4974]: I1013 19:02:24.064424 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-nfs-2-0"] Oct 13 19:02:24 crc kubenswrapper[4974]: W1013 19:02:24.065324 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90cb7cf6_4e7a_4583_a5d5_4ce3d0a3c16e.slice/crio-95246defb144805a49a4b2529198e53acc02bdeecba580c65400bac67258c04a WatchSource:0}: Error finding container 95246defb144805a49a4b2529198e53acc02bdeecba580c65400bac67258c04a: Status 404 returned error can't find the container with id 95246defb144805a49a4b2529198e53acc02bdeecba580c65400bac67258c04a Oct 13 19:02:24 crc kubenswrapper[4974]: W1013 19:02:24.073929 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod688ac6ef_82eb_421b_9949_a832b8a73319.slice/crio-b7d621443a171123d2fc9df30ee169f9a7ab73879d9820470804c61ca0f0d40e WatchSource:0}: Error finding container b7d621443a171123d2fc9df30ee169f9a7ab73879d9820470804c61ca0f0d40e: Status 404 returned error can't find the container with id b7d621443a171123d2fc9df30ee169f9a7ab73879d9820470804c61ca0f0d40e Oct 13 19:02:24 crc kubenswrapper[4974]: I1013 19:02:24.607819 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e","Type":"ContainerStarted","Data":"cb494ac73d0635190ad235ec0a0975de23e262a91b5a0e0e406037286b6e3439"} Oct 13 19:02:24 crc kubenswrapper[4974]: I1013 19:02:24.608407 4974 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" event={"ID":"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e","Type":"ContainerStarted","Data":"95246defb144805a49a4b2529198e53acc02bdeecba580c65400bac67258c04a"} Oct 13 19:02:24 crc kubenswrapper[4974]: I1013 19:02:24.613366 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"57963e15-5fff-4158-ad83-0e4bd2ca1f7f","Type":"ContainerStarted","Data":"6b65c3a2e45114e28c8c9f8f34d9049f83c870eff655624591008e744b5a0037"} Oct 13 19:02:24 crc kubenswrapper[4974]: I1013 19:02:24.613400 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"57963e15-5fff-4158-ad83-0e4bd2ca1f7f","Type":"ContainerStarted","Data":"433de28865499b7d3d827784b0c205a9834c489c804d5f00045345e068068e98"} Oct 13 19:02:24 crc kubenswrapper[4974]: I1013 19:02:24.621700 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"688ac6ef-82eb-421b-9949-a832b8a73319","Type":"ContainerStarted","Data":"b7d621443a171123d2fc9df30ee169f9a7ab73879d9820470804c61ca0f0d40e"} Oct 13 19:02:25 crc kubenswrapper[4974]: I1013 19:02:25.632964 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"688ac6ef-82eb-421b-9949-a832b8a73319","Type":"ContainerStarted","Data":"c2bec4193febf1b787af9c6fc55f96307859362109bc0c19671a585e2c88066a"} Oct 13 19:02:25 crc kubenswrapper[4974]: I1013 19:02:25.633757 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-2-0" event={"ID":"688ac6ef-82eb-421b-9949-a832b8a73319","Type":"ContainerStarted","Data":"7035dd7ae81d57a00f9355b9f22a0634ff967c6ff16d42ea7076ca334d87c143"} Oct 13 19:02:25 crc kubenswrapper[4974]: I1013 19:02:25.635597 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-nfs-0" 
event={"ID":"90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e","Type":"ContainerStarted","Data":"8960d528bafaaab376f12651d6644c81fa82883553c5da4eb2692abc86962dce"} Oct 13 19:02:25 crc kubenswrapper[4974]: I1013 19:02:25.638196 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"57963e15-5fff-4158-ad83-0e4bd2ca1f7f","Type":"ContainerStarted","Data":"f6598f741e64d3e706fa327e167a7224d3d9c70cb7b28eb38f700a64de67bb88"} Oct 13 19:02:25 crc kubenswrapper[4974]: I1013 19:02:25.679678 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-2-0" podStartSLOduration=2.3693991309999998 podStartE2EDuration="2.679630957s" podCreationTimestamp="2025-10-13 19:02:23 +0000 UTC" firstStartedPulling="2025-10-13 19:02:24.076268689 +0000 UTC m=+2878.980634769" lastFinishedPulling="2025-10-13 19:02:24.386500515 +0000 UTC m=+2879.290866595" observedRunningTime="2025-10-13 19:02:25.667773383 +0000 UTC m=+2880.572139493" watchObservedRunningTime="2025-10-13 19:02:25.679630957 +0000 UTC m=+2880.583997067" Oct 13 19:02:25 crc kubenswrapper[4974]: I1013 19:02:25.710139 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-nfs-0" podStartSLOduration=3.396804865 podStartE2EDuration="3.710116367s" podCreationTimestamp="2025-10-13 19:02:22 +0000 UTC" firstStartedPulling="2025-10-13 19:02:24.073954194 +0000 UTC m=+2878.978320274" lastFinishedPulling="2025-10-13 19:02:24.387265696 +0000 UTC m=+2879.291631776" observedRunningTime="2025-10-13 19:02:25.696101191 +0000 UTC m=+2880.600467291" watchObservedRunningTime="2025-10-13 19:02:25.710116367 +0000 UTC m=+2880.614482457" Oct 13 19:02:25 crc kubenswrapper[4974]: I1013 19:02:25.723776 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.4554722780000002 podStartE2EDuration="3.723755831s" podCreationTimestamp="2025-10-13 19:02:22 +0000 UTC" 
firstStartedPulling="2025-10-13 19:02:23.873490483 +0000 UTC m=+2878.777856563" lastFinishedPulling="2025-10-13 19:02:24.141774036 +0000 UTC m=+2879.046140116" observedRunningTime="2025-10-13 19:02:25.720189111 +0000 UTC m=+2880.624555191" watchObservedRunningTime="2025-10-13 19:02:25.723755831 +0000 UTC m=+2880.628121921" Oct 13 19:02:28 crc kubenswrapper[4974]: I1013 19:02:28.224412 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Oct 13 19:02:28 crc kubenswrapper[4974]: I1013 19:02:28.316615 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:28 crc kubenswrapper[4974]: I1013 19:02:28.319461 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:33 crc kubenswrapper[4974]: I1013 19:02:33.392914 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Oct 13 19:02:33 crc kubenswrapper[4974]: I1013 19:02:33.481257 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-0" Oct 13 19:02:33 crc kubenswrapper[4974]: I1013 19:02:33.627819 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-nfs-2-0" Oct 13 19:02:37 crc kubenswrapper[4974]: I1013 19:02:37.742795 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:02:37 crc kubenswrapper[4974]: I1013 19:02:37.743499 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:03:07 crc kubenswrapper[4974]: I1013 19:03:07.743008 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:03:07 crc kubenswrapper[4974]: I1013 19:03:07.744520 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:03:07 crc kubenswrapper[4974]: I1013 19:03:07.744817 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 19:03:07 crc kubenswrapper[4974]: I1013 19:03:07.749715 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"47cb04159faadae1b5ae1bef9861f9e72b533c227670e91c75a48fa646414e60"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 19:03:07 crc kubenswrapper[4974]: I1013 19:03:07.749827 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://47cb04159faadae1b5ae1bef9861f9e72b533c227670e91c75a48fa646414e60" gracePeriod=600 Oct 13 19:03:08 crc kubenswrapper[4974]: I1013 19:03:08.235803 4974 generic.go:334] "Generic 
(PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="47cb04159faadae1b5ae1bef9861f9e72b533c227670e91c75a48fa646414e60" exitCode=0 Oct 13 19:03:08 crc kubenswrapper[4974]: I1013 19:03:08.235976 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"47cb04159faadae1b5ae1bef9861f9e72b533c227670e91c75a48fa646414e60"} Oct 13 19:03:08 crc kubenswrapper[4974]: I1013 19:03:08.236156 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b"} Oct 13 19:03:08 crc kubenswrapper[4974]: I1013 19:03:08.236199 4974 scope.go:117] "RemoveContainer" containerID="f6f39008957ae4a5e03e0f3f84f0b29cdac6e7af1cc3afb3763d0b1875b0bbe1" Oct 13 19:03:26 crc kubenswrapper[4974]: I1013 19:03:26.140311 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 19:03:26 crc kubenswrapper[4974]: I1013 19:03:26.141890 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="prometheus" containerID="cri-o://8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6" gracePeriod=600 Oct 13 19:03:26 crc kubenswrapper[4974]: I1013 19:03:26.142317 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="thanos-sidecar" containerID="cri-o://0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303" gracePeriod=600 Oct 13 19:03:26 crc kubenswrapper[4974]: I1013 19:03:26.142425 4974 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="config-reloader" containerID="cri-o://30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0" gracePeriod=600 Oct 13 19:03:26 crc kubenswrapper[4974]: I1013 19:03:26.454679 4974 generic.go:334] "Generic (PLEG): container finished" podID="7f3cf41f-a929-4f3c-9063-682924915904" containerID="0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303" exitCode=0 Oct 13 19:03:26 crc kubenswrapper[4974]: I1013 19:03:26.455055 4974 generic.go:334] "Generic (PLEG): container finished" podID="7f3cf41f-a929-4f3c-9063-682924915904" containerID="8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6" exitCode=0 Oct 13 19:03:26 crc kubenswrapper[4974]: I1013 19:03:26.454729 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7f3cf41f-a929-4f3c-9063-682924915904","Type":"ContainerDied","Data":"0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303"} Oct 13 19:03:26 crc kubenswrapper[4974]: I1013 19:03:26.455106 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7f3cf41f-a929-4f3c-9063-682924915904","Type":"ContainerDied","Data":"8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6"} Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.094191 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.212671 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-thanos-prometheus-http-client-file\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.212760 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.212810 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-secret-combined-ca-bundle\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.213578 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.213644 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwc77\" (UniqueName: \"kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-kube-api-access-pwc77\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.213790 4974 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7f3cf41f-a929-4f3c-9063-682924915904-prometheus-metric-storage-rulefiles-0\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.213813 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-tls-assets\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.213832 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-config\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.213863 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7f3cf41f-a929-4f3c-9063-682924915904-config-out\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.213950 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.213972 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"7f3cf41f-a929-4f3c-9063-682924915904\" (UID: \"7f3cf41f-a929-4f3c-9063-682924915904\") " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.221029 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f3cf41f-a929-4f3c-9063-682924915904-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.222804 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.223921 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.224154 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.225023 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f3cf41f-a929-4f3c-9063-682924915904-config-out" (OuterVolumeSpecName: "config-out") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.234834 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.239862 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). 
InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.242829 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-config" (OuterVolumeSpecName: "config") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.245127 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-kube-api-access-pwc77" (OuterVolumeSpecName: "kube-api-access-pwc77") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). InnerVolumeSpecName "kube-api-access-pwc77". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.292160 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). InnerVolumeSpecName "pvc-1fd59f2a-5ebb-4374-98ce-075e681db308". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.316281 4974 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7f3cf41f-a929-4f3c-9063-682924915904-config-out\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.316314 4974 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.316327 4974 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.316338 4974 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.316348 4974 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.316371 4974 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") on node \"crc\" " Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 
19:03:27.316380 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwc77\" (UniqueName: \"kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-kube-api-access-pwc77\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.316389 4974 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7f3cf41f-a929-4f3c-9063-682924915904-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.316399 4974 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7f3cf41f-a929-4f3c-9063-682924915904-tls-assets\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.316408 4974 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-config\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.353578 4974 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.354796 4974 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-1fd59f2a-5ebb-4374-98ce-075e681db308" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308") on node "crc" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.355950 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config" (OuterVolumeSpecName: "web-config") pod "7f3cf41f-a929-4f3c-9063-682924915904" (UID: "7f3cf41f-a929-4f3c-9063-682924915904"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.419793 4974 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7f3cf41f-a929-4f3c-9063-682924915904-web-config\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.419817 4974 reconciler_common.go:293] "Volume detached for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.466796 4974 generic.go:334] "Generic (PLEG): container finished" podID="7f3cf41f-a929-4f3c-9063-682924915904" containerID="30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0" exitCode=0 Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.466852 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7f3cf41f-a929-4f3c-9063-682924915904","Type":"ContainerDied","Data":"30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0"} Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.466918 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"7f3cf41f-a929-4f3c-9063-682924915904","Type":"ContainerDied","Data":"ea9183d20ca0204634cade5fcfc457aaa5f1a6e3b5a910e030e4ececcc8003c6"} Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.466941 4974 scope.go:117] "RemoveContainer" containerID="0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.467180 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.494036 4974 scope.go:117] "RemoveContainer" containerID="30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.505603 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.516086 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.528847 4974 scope.go:117] "RemoveContainer" containerID="8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.548738 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 19:03:27 crc kubenswrapper[4974]: E1013 19:03:27.549220 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="prometheus" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.549239 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="prometheus" Oct 13 19:03:27 crc kubenswrapper[4974]: E1013 19:03:27.549248 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="config-reloader" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.549254 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="config-reloader" Oct 13 19:03:27 crc kubenswrapper[4974]: E1013 19:03:27.549267 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="thanos-sidecar" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.549275 4974 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="thanos-sidecar" Oct 13 19:03:27 crc kubenswrapper[4974]: E1013 19:03:27.549284 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="init-config-reloader" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.549291 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="init-config-reloader" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.549467 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="prometheus" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.549480 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="thanos-sidecar" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.549491 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3cf41f-a929-4f3c-9063-682924915904" containerName="config-reloader" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.553189 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.560686 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.564457 4974 scope.go:117] "RemoveContainer" containerID="6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.567680 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.567911 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.568385 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-97gxg" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.568603 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.571020 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.573333 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.602320 4974 scope.go:117] "RemoveContainer" containerID="0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303" Oct 13 19:03:27 crc kubenswrapper[4974]: E1013 19:03:27.604819 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303\": container with ID starting with 
0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303 not found: ID does not exist" containerID="0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.604859 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303"} err="failed to get container status \"0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303\": rpc error: code = NotFound desc = could not find container \"0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303\": container with ID starting with 0611b964ce77046823a5a7b65e59e8279ff4b119976fbb44538e1fa07576a303 not found: ID does not exist" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.604883 4974 scope.go:117] "RemoveContainer" containerID="30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0" Oct 13 19:03:27 crc kubenswrapper[4974]: E1013 19:03:27.605581 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0\": container with ID starting with 30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0 not found: ID does not exist" containerID="30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.605609 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0"} err="failed to get container status \"30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0\": rpc error: code = NotFound desc = could not find container \"30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0\": container with ID starting with 30eef0912e4de1a659287257af5f290c5c813dfa912ad9f259b2880d7a8867c0 not found: ID does not 
exist" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.605622 4974 scope.go:117] "RemoveContainer" containerID="8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6" Oct 13 19:03:27 crc kubenswrapper[4974]: E1013 19:03:27.605867 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6\": container with ID starting with 8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6 not found: ID does not exist" containerID="8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.605883 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6"} err="failed to get container status \"8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6\": rpc error: code = NotFound desc = could not find container \"8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6\": container with ID starting with 8056974bdb053872176bd3aa4dceb9e37529e9be9aa68b843cc85a4bf8fe55c6 not found: ID does not exist" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.605900 4974 scope.go:117] "RemoveContainer" containerID="6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd" Oct 13 19:03:27 crc kubenswrapper[4974]: E1013 19:03:27.606194 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd\": container with ID starting with 6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd not found: ID does not exist" containerID="6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.606219 4974 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd"} err="failed to get container status \"6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd\": rpc error: code = NotFound desc = could not find container \"6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd\": container with ID starting with 6663b5483e854309ea78f20f24b36561658965f9c1c6a8c4b04177a2195102cd not found: ID does not exist" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.723530 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24cb3219-26af-43ab-95da-2320e69129db-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.723663 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.723700 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.723864 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcscv\" (UniqueName: 
\"kubernetes.io/projected/24cb3219-26af-43ab-95da-2320e69129db-kube-api-access-tcscv\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.723961 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-config\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.724003 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.724024 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.724070 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc 
kubenswrapper[4974]: I1013 19:03:27.724153 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.724174 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24cb3219-26af-43ab-95da-2320e69129db-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.724201 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/24cb3219-26af-43ab-95da-2320e69129db-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.822297 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f3cf41f-a929-4f3c-9063-682924915904" path="/var/lib/kubelet/pods/7f3cf41f-a929-4f3c-9063-682924915904/volumes" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.825220 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/24cb3219-26af-43ab-95da-2320e69129db-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: 
I1013 19:03:27.825286 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24cb3219-26af-43ab-95da-2320e69129db-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.825346 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.825372 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.825394 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcscv\" (UniqueName: \"kubernetes.io/projected/24cb3219-26af-43ab-95da-2320e69129db-kube-api-access-tcscv\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.825425 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-config\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc 
kubenswrapper[4974]: I1013 19:03:27.825456 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.825476 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.825519 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.825571 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.825590 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24cb3219-26af-43ab-95da-2320e69129db-config-out\") pod 
\"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.826179 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/24cb3219-26af-43ab-95da-2320e69129db-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.827753 4974 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.827793 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/e453070dec09825da50fcd48128605195703a3e04c8868309f22a520ea4896c6/globalmount\"" pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.830044 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-config\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.830071 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/24cb3219-26af-43ab-95da-2320e69129db-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: 
\"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.830681 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.831492 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.832545 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.838702 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.838964 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/24cb3219-26af-43ab-95da-2320e69129db-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.839429 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/24cb3219-26af-43ab-95da-2320e69129db-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.852238 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcscv\" (UniqueName: \"kubernetes.io/projected/24cb3219-26af-43ab-95da-2320e69129db-kube-api-access-tcscv\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:27 crc kubenswrapper[4974]: I1013 19:03:27.884130 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1fd59f2a-5ebb-4374-98ce-075e681db308\") pod \"prometheus-metric-storage-0\" (UID: \"24cb3219-26af-43ab-95da-2320e69129db\") " pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:28 crc kubenswrapper[4974]: I1013 19:03:28.178527 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:28 crc kubenswrapper[4974]: I1013 19:03:28.717920 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Oct 13 19:03:29 crc kubenswrapper[4974]: I1013 19:03:29.518056 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"24cb3219-26af-43ab-95da-2320e69129db","Type":"ContainerStarted","Data":"101877fb31abb247b4ac6a89d600ebc452417d1c20a9a84f980edf8df533bedc"} Oct 13 19:03:33 crc kubenswrapper[4974]: I1013 19:03:33.571971 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"24cb3219-26af-43ab-95da-2320e69129db","Type":"ContainerStarted","Data":"761b18ab974d3762fd9ee608a1dd16cedfe6f3633d1ce5a9ddce4873c288eb81"} Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.200394 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fqnx2"] Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.204316 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.228990 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqnx2"] Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.270996 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-catalog-content\") pod \"redhat-marketplace-fqnx2\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.271260 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5h65\" (UniqueName: \"kubernetes.io/projected/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-kube-api-access-q5h65\") pod \"redhat-marketplace-fqnx2\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.271639 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-utilities\") pod \"redhat-marketplace-fqnx2\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.374013 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5h65\" (UniqueName: \"kubernetes.io/projected/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-kube-api-access-q5h65\") pod \"redhat-marketplace-fqnx2\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.374359 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-utilities\") pod \"redhat-marketplace-fqnx2\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.374546 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-catalog-content\") pod \"redhat-marketplace-fqnx2\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.375171 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-utilities\") pod \"redhat-marketplace-fqnx2\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.375177 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-catalog-content\") pod \"redhat-marketplace-fqnx2\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.406537 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5h65\" (UniqueName: \"kubernetes.io/projected/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-kube-api-access-q5h65\") pod \"redhat-marketplace-fqnx2\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:41 crc kubenswrapper[4974]: I1013 19:03:41.540306 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:42 crc kubenswrapper[4974]: I1013 19:03:42.077341 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqnx2"] Oct 13 19:03:42 crc kubenswrapper[4974]: I1013 19:03:42.711859 4974 generic.go:334] "Generic (PLEG): container finished" podID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" containerID="ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371" exitCode=0 Oct 13 19:03:42 crc kubenswrapper[4974]: I1013 19:03:42.711953 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqnx2" event={"ID":"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a","Type":"ContainerDied","Data":"ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371"} Oct 13 19:03:42 crc kubenswrapper[4974]: I1013 19:03:42.712604 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqnx2" event={"ID":"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a","Type":"ContainerStarted","Data":"1dc4cb87e0b7b9f84a255f4d9c245736277713d059995967e83061704718c9be"} Oct 13 19:03:44 crc kubenswrapper[4974]: I1013 19:03:44.737437 4974 generic.go:334] "Generic (PLEG): container finished" podID="24cb3219-26af-43ab-95da-2320e69129db" containerID="761b18ab974d3762fd9ee608a1dd16cedfe6f3633d1ce5a9ddce4873c288eb81" exitCode=0 Oct 13 19:03:44 crc kubenswrapper[4974]: I1013 19:03:44.737541 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"24cb3219-26af-43ab-95da-2320e69129db","Type":"ContainerDied","Data":"761b18ab974d3762fd9ee608a1dd16cedfe6f3633d1ce5a9ddce4873c288eb81"} Oct 13 19:03:44 crc kubenswrapper[4974]: I1013 19:03:44.742855 4974 generic.go:334] "Generic (PLEG): container finished" podID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" containerID="4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb" exitCode=0 Oct 13 19:03:44 crc 
kubenswrapper[4974]: I1013 19:03:44.742904 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqnx2" event={"ID":"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a","Type":"ContainerDied","Data":"4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb"} Oct 13 19:03:45 crc kubenswrapper[4974]: I1013 19:03:45.756940 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"24cb3219-26af-43ab-95da-2320e69129db","Type":"ContainerStarted","Data":"cc6859af7ea2c4e11ba4a568285f2b513272f260c782f243de01506ccc19e480"} Oct 13 19:03:45 crc kubenswrapper[4974]: I1013 19:03:45.760427 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqnx2" event={"ID":"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a","Type":"ContainerStarted","Data":"aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef"} Oct 13 19:03:45 crc kubenswrapper[4974]: I1013 19:03:45.783905 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fqnx2" podStartSLOduration=2.345837217 podStartE2EDuration="4.7838846s" podCreationTimestamp="2025-10-13 19:03:41 +0000 UTC" firstStartedPulling="2025-10-13 19:03:42.713825084 +0000 UTC m=+2957.618191174" lastFinishedPulling="2025-10-13 19:03:45.151872477 +0000 UTC m=+2960.056238557" observedRunningTime="2025-10-13 19:03:45.783074447 +0000 UTC m=+2960.687440567" watchObservedRunningTime="2025-10-13 19:03:45.7838846 +0000 UTC m=+2960.688250690" Oct 13 19:03:49 crc kubenswrapper[4974]: I1013 19:03:49.822267 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"24cb3219-26af-43ab-95da-2320e69129db","Type":"ContainerStarted","Data":"2d6c97d9df360fd793e30c6f25fcd7ac7553126314ff5859b56c79aed21cc80c"} Oct 13 19:03:49 crc kubenswrapper[4974]: I1013 19:03:49.822836 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"24cb3219-26af-43ab-95da-2320e69129db","Type":"ContainerStarted","Data":"c7eab4369a0c3902a9d1f9b725c50e8718acdb97db3bc4f607833b9eac87d45b"} Oct 13 19:03:49 crc kubenswrapper[4974]: I1013 19:03:49.852826 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.852811121 podStartE2EDuration="22.852811121s" podCreationTimestamp="2025-10-13 19:03:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 19:03:49.845454264 +0000 UTC m=+2964.749820364" watchObservedRunningTime="2025-10-13 19:03:49.852811121 +0000 UTC m=+2964.757177201" Oct 13 19:03:51 crc kubenswrapper[4974]: I1013 19:03:51.541618 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:51 crc kubenswrapper[4974]: I1013 19:03:51.542191 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:51 crc kubenswrapper[4974]: I1013 19:03:51.646149 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:51 crc kubenswrapper[4974]: I1013 19:03:51.910142 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:51 crc kubenswrapper[4974]: I1013 19:03:51.956583 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqnx2"] Oct 13 19:03:53 crc kubenswrapper[4974]: I1013 19:03:53.179029 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:53 crc kubenswrapper[4974]: I1013 19:03:53.881965 4974 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-marketplace/redhat-marketplace-fqnx2" podUID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" containerName="registry-server" containerID="cri-o://aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef" gracePeriod=2 Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.431236 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.539185 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-utilities\") pod \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.539380 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5h65\" (UniqueName: \"kubernetes.io/projected/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-kube-api-access-q5h65\") pod \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.539447 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-catalog-content\") pod \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\" (UID: \"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a\") " Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.540081 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-utilities" (OuterVolumeSpecName: "utilities") pod "6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" (UID: "6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.546036 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-kube-api-access-q5h65" (OuterVolumeSpecName: "kube-api-access-q5h65") pod "6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" (UID: "6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a"). InnerVolumeSpecName "kube-api-access-q5h65". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.566614 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" (UID: "6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.641637 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.641690 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5h65\" (UniqueName: \"kubernetes.io/projected/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-kube-api-access-q5h65\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.641707 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.902842 4974 generic.go:334] "Generic (PLEG): container finished" podID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" 
containerID="aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef" exitCode=0 Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.902896 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqnx2" event={"ID":"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a","Type":"ContainerDied","Data":"aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef"} Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.902936 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fqnx2" event={"ID":"6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a","Type":"ContainerDied","Data":"1dc4cb87e0b7b9f84a255f4d9c245736277713d059995967e83061704718c9be"} Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.902983 4974 scope.go:117] "RemoveContainer" containerID="aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef" Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.903002 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fqnx2" Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.947273 4974 scope.go:117] "RemoveContainer" containerID="4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb" Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.965854 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqnx2"] Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.976112 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fqnx2"] Oct 13 19:03:54 crc kubenswrapper[4974]: I1013 19:03:54.993304 4974 scope.go:117] "RemoveContainer" containerID="ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371" Oct 13 19:03:55 crc kubenswrapper[4974]: I1013 19:03:55.060036 4974 scope.go:117] "RemoveContainer" containerID="aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef" Oct 13 19:03:55 crc kubenswrapper[4974]: E1013 19:03:55.060919 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef\": container with ID starting with aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef not found: ID does not exist" containerID="aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef" Oct 13 19:03:55 crc kubenswrapper[4974]: I1013 19:03:55.061114 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef"} err="failed to get container status \"aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef\": rpc error: code = NotFound desc = could not find container \"aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef\": container with ID starting with aa63661f98668836c0555537552224d77fcaa565f42dbde2c37be9c5e7e610ef not found: 
ID does not exist" Oct 13 19:03:55 crc kubenswrapper[4974]: I1013 19:03:55.061191 4974 scope.go:117] "RemoveContainer" containerID="4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb" Oct 13 19:03:55 crc kubenswrapper[4974]: E1013 19:03:55.061902 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb\": container with ID starting with 4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb not found: ID does not exist" containerID="4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb" Oct 13 19:03:55 crc kubenswrapper[4974]: I1013 19:03:55.061970 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb"} err="failed to get container status \"4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb\": rpc error: code = NotFound desc = could not find container \"4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb\": container with ID starting with 4137fbba2f867a92af30ac889d6e1fa8bf780852d1c11d5e26dea08629e608bb not found: ID does not exist" Oct 13 19:03:55 crc kubenswrapper[4974]: I1013 19:03:55.062000 4974 scope.go:117] "RemoveContainer" containerID="ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371" Oct 13 19:03:55 crc kubenswrapper[4974]: E1013 19:03:55.062430 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371\": container with ID starting with ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371 not found: ID does not exist" containerID="ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371" Oct 13 19:03:55 crc kubenswrapper[4974]: I1013 19:03:55.062469 4974 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371"} err="failed to get container status \"ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371\": rpc error: code = NotFound desc = could not find container \"ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371\": container with ID starting with ca24341c266c7b752aeafd933b913d15b943cb043c96bb6fee0389b53870a371 not found: ID does not exist" Oct 13 19:03:55 crc kubenswrapper[4974]: I1013 19:03:55.831740 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" path="/var/lib/kubelet/pods/6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a/volumes" Oct 13 19:03:58 crc kubenswrapper[4974]: I1013 19:03:58.179328 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:58 crc kubenswrapper[4974]: I1013 19:03:58.191431 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Oct 13 19:03:58 crc kubenswrapper[4974]: I1013 19:03:58.978325 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.501352 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Oct 13 19:04:23 crc kubenswrapper[4974]: E1013 19:04:23.503002 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" containerName="extract-content" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.503060 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" containerName="extract-content" Oct 13 19:04:23 crc kubenswrapper[4974]: E1013 19:04:23.503146 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" containerName="registry-server" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.503160 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" containerName="registry-server" Oct 13 19:04:23 crc kubenswrapper[4974]: E1013 19:04:23.503189 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" containerName="extract-utilities" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.503231 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" containerName="extract-utilities" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.503790 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="6600a4d3-e4fe-45c2-8bc8-9672b38a2b5a" containerName="registry-server" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.505398 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.507739 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fxpsb" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.507918 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.508780 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.511778 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.520226 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.583773 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-config-data\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.583849 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.584916 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.585004 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.585106 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.585176 4974 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.585255 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.585385 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.585559 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qtmw\" (UniqueName: \"kubernetes.io/projected/98e331dd-24d4-4707-b432-557ea90e6048-kube-api-access-2qtmw\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.686968 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qtmw\" (UniqueName: \"kubernetes.io/projected/98e331dd-24d4-4707-b432-557ea90e6048-kube-api-access-2qtmw\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.687347 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-config-data\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.687384 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.687440 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.687457 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.687502 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.687524 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.687550 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.687599 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.687973 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.688404 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.688730 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: 
\"kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.692842 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-config-data\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.695020 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.695364 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.700825 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.711896 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " 
pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.717860 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qtmw\" (UniqueName: \"kubernetes.io/projected/98e331dd-24d4-4707-b432-557ea90e6048-kube-api-access-2qtmw\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.732927 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"tempest-tests-tempest\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " pod="openstack/tempest-tests-tempest" Oct 13 19:04:23 crc kubenswrapper[4974]: I1013 19:04:23.870238 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 13 19:04:24 crc kubenswrapper[4974]: I1013 19:04:24.355365 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Oct 13 19:04:24 crc kubenswrapper[4974]: W1013 19:04:24.369932 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98e331dd_24d4_4707_b432_557ea90e6048.slice/crio-7c76a62293e468edfb8209a299d314723ff77a80209521dc39d72dee32c784df WatchSource:0}: Error finding container 7c76a62293e468edfb8209a299d314723ff77a80209521dc39d72dee32c784df: Status 404 returned error can't find the container with id 7c76a62293e468edfb8209a299d314723ff77a80209521dc39d72dee32c784df Oct 13 19:04:25 crc kubenswrapper[4974]: I1013 19:04:25.304506 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"98e331dd-24d4-4707-b432-557ea90e6048","Type":"ContainerStarted","Data":"7c76a62293e468edfb8209a299d314723ff77a80209521dc39d72dee32c784df"} Oct 13 19:04:33 crc kubenswrapper[4974]: I1013 
19:04:33.547257 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Oct 13 19:04:35 crc kubenswrapper[4974]: I1013 19:04:35.437394 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"98e331dd-24d4-4707-b432-557ea90e6048","Type":"ContainerStarted","Data":"91c6df01012750b43cc81cf7c58c2e656d872828100e2626d6b82a19b1144619"} Oct 13 19:04:35 crc kubenswrapper[4974]: I1013 19:04:35.461372 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.290030435 podStartE2EDuration="13.461351938s" podCreationTimestamp="2025-10-13 19:04:22 +0000 UTC" firstStartedPulling="2025-10-13 19:04:24.372839243 +0000 UTC m=+2999.277205353" lastFinishedPulling="2025-10-13 19:04:33.544160766 +0000 UTC m=+3008.448526856" observedRunningTime="2025-10-13 19:04:35.456486311 +0000 UTC m=+3010.360852431" watchObservedRunningTime="2025-10-13 19:04:35.461351938 +0000 UTC m=+3010.365718058" Oct 13 19:05:37 crc kubenswrapper[4974]: I1013 19:05:37.742732 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:05:37 crc kubenswrapper[4974]: I1013 19:05:37.743550 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:06:07 crc kubenswrapper[4974]: I1013 19:06:07.744074 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:06:07 crc kubenswrapper[4974]: I1013 19:06:07.744837 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:06:37 crc kubenswrapper[4974]: I1013 19:06:37.743527 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:06:37 crc kubenswrapper[4974]: I1013 19:06:37.745369 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:06:37 crc kubenswrapper[4974]: I1013 19:06:37.745506 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 19:06:37 crc kubenswrapper[4974]: I1013 19:06:37.746514 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 19:06:37 crc 
kubenswrapper[4974]: I1013 19:06:37.746721 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" gracePeriod=600 Oct 13 19:06:37 crc kubenswrapper[4974]: E1013 19:06:37.878166 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:06:37 crc kubenswrapper[4974]: I1013 19:06:37.916061 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" exitCode=0 Oct 13 19:06:37 crc kubenswrapper[4974]: I1013 19:06:37.916215 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b"} Oct 13 19:06:37 crc kubenswrapper[4974]: I1013 19:06:37.916278 4974 scope.go:117] "RemoveContainer" containerID="47cb04159faadae1b5ae1bef9861f9e72b533c227670e91c75a48fa646414e60" Oct 13 19:06:37 crc kubenswrapper[4974]: I1013 19:06:37.917315 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:06:37 crc kubenswrapper[4974]: E1013 19:06:37.918013 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:06:51 crc kubenswrapper[4974]: I1013 19:06:51.811546 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:06:51 crc kubenswrapper[4974]: E1013 19:06:51.812529 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:07:02 crc kubenswrapper[4974]: I1013 19:07:02.811998 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:07:02 crc kubenswrapper[4974]: E1013 19:07:02.813019 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:07:14 crc kubenswrapper[4974]: I1013 19:07:14.812476 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:07:14 crc kubenswrapper[4974]: E1013 19:07:14.813591 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:07:26 crc kubenswrapper[4974]: I1013 19:07:26.811957 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:07:26 crc kubenswrapper[4974]: E1013 19:07:26.813138 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:07:41 crc kubenswrapper[4974]: I1013 19:07:41.812929 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:07:41 crc kubenswrapper[4974]: E1013 19:07:41.814176 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:07:54 crc kubenswrapper[4974]: I1013 19:07:54.811636 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:07:54 crc kubenswrapper[4974]: E1013 19:07:54.812293 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:08:05 crc kubenswrapper[4974]: I1013 19:08:05.835370 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:08:05 crc kubenswrapper[4974]: E1013 19:08:05.836197 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:08:17 crc kubenswrapper[4974]: I1013 19:08:17.812716 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:08:17 crc kubenswrapper[4974]: E1013 19:08:17.813699 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:08:31 crc kubenswrapper[4974]: I1013 19:08:31.812003 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:08:31 crc kubenswrapper[4974]: E1013 19:08:31.813182 4974 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:08:44 crc kubenswrapper[4974]: I1013 19:08:44.812681 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:08:44 crc kubenswrapper[4974]: E1013 19:08:44.813713 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:08:57 crc kubenswrapper[4974]: I1013 19:08:57.812943 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:08:57 crc kubenswrapper[4974]: E1013 19:08:57.814371 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:09:08 crc kubenswrapper[4974]: I1013 19:09:08.811459 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:09:08 crc kubenswrapper[4974]: E1013 19:09:08.812584 4974 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:09:21 crc kubenswrapper[4974]: I1013 19:09:21.812354 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:09:21 crc kubenswrapper[4974]: E1013 19:09:21.813322 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:09:35 crc kubenswrapper[4974]: I1013 19:09:35.826606 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:09:35 crc kubenswrapper[4974]: E1013 19:09:35.827690 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:09:50 crc kubenswrapper[4974]: I1013 19:09:50.812631 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:09:50 crc kubenswrapper[4974]: E1013 19:09:50.815438 4974 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:10:02 crc kubenswrapper[4974]: I1013 19:10:02.813593 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:10:02 crc kubenswrapper[4974]: E1013 19:10:02.814617 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:10:13 crc kubenswrapper[4974]: I1013 19:10:13.812559 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:10:13 crc kubenswrapper[4974]: E1013 19:10:13.813480 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:10:24 crc kubenswrapper[4974]: I1013 19:10:24.812325 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:10:24 crc kubenswrapper[4974]: E1013 
19:10:24.813314 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:10:37 crc kubenswrapper[4974]: I1013 19:10:37.812377 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:10:37 crc kubenswrapper[4974]: E1013 19:10:37.813467 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.654319 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dlp8w"] Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.657444 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.692253 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlp8w"] Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.720148 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-utilities\") pod \"certified-operators-dlp8w\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.720241 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxkhn\" (UniqueName: \"kubernetes.io/projected/822e6927-6f86-42e9-9d21-667b687dc274-kube-api-access-jxkhn\") pod \"certified-operators-dlp8w\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.720293 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-catalog-content\") pod \"certified-operators-dlp8w\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.823927 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-utilities\") pod \"certified-operators-dlp8w\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.823997 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jxkhn\" (UniqueName: \"kubernetes.io/projected/822e6927-6f86-42e9-9d21-667b687dc274-kube-api-access-jxkhn\") pod \"certified-operators-dlp8w\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.824032 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-catalog-content\") pod \"certified-operators-dlp8w\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.825366 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-catalog-content\") pod \"certified-operators-dlp8w\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.825822 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-utilities\") pod \"certified-operators-dlp8w\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.856158 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxkhn\" (UniqueName: \"kubernetes.io/projected/822e6927-6f86-42e9-9d21-667b687dc274-kube-api-access-jxkhn\") pod \"certified-operators-dlp8w\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:47 crc kubenswrapper[4974]: I1013 19:10:47.992042 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:48 crc kubenswrapper[4974]: I1013 19:10:48.532030 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dlp8w"] Oct 13 19:10:48 crc kubenswrapper[4974]: I1013 19:10:48.938367 4974 generic.go:334] "Generic (PLEG): container finished" podID="822e6927-6f86-42e9-9d21-667b687dc274" containerID="228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305" exitCode=0 Oct 13 19:10:48 crc kubenswrapper[4974]: I1013 19:10:48.938579 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlp8w" event={"ID":"822e6927-6f86-42e9-9d21-667b687dc274","Type":"ContainerDied","Data":"228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305"} Oct 13 19:10:48 crc kubenswrapper[4974]: I1013 19:10:48.938834 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlp8w" event={"ID":"822e6927-6f86-42e9-9d21-667b687dc274","Type":"ContainerStarted","Data":"a3a9d52672d4db4516de547d85cedd4f72ae7eae3d94a58031baeadd752a6d78"} Oct 13 19:10:48 crc kubenswrapper[4974]: I1013 19:10:48.942394 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 19:10:49 crc kubenswrapper[4974]: I1013 19:10:49.811977 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:10:49 crc kubenswrapper[4974]: E1013 19:10:49.812450 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 
19:10:50 crc kubenswrapper[4974]: I1013 19:10:50.963055 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlp8w" event={"ID":"822e6927-6f86-42e9-9d21-667b687dc274","Type":"ContainerStarted","Data":"1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22"} Oct 13 19:10:51 crc kubenswrapper[4974]: I1013 19:10:51.974255 4974 generic.go:334] "Generic (PLEG): container finished" podID="822e6927-6f86-42e9-9d21-667b687dc274" containerID="1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22" exitCode=0 Oct 13 19:10:51 crc kubenswrapper[4974]: I1013 19:10:51.974462 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlp8w" event={"ID":"822e6927-6f86-42e9-9d21-667b687dc274","Type":"ContainerDied","Data":"1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22"} Oct 13 19:10:52 crc kubenswrapper[4974]: I1013 19:10:52.987600 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlp8w" event={"ID":"822e6927-6f86-42e9-9d21-667b687dc274","Type":"ContainerStarted","Data":"2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632"} Oct 13 19:10:53 crc kubenswrapper[4974]: I1013 19:10:53.008281 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dlp8w" podStartSLOduration=2.607701054 podStartE2EDuration="6.00826469s" podCreationTimestamp="2025-10-13 19:10:47 +0000 UTC" firstStartedPulling="2025-10-13 19:10:48.941866501 +0000 UTC m=+3383.846232621" lastFinishedPulling="2025-10-13 19:10:52.342430157 +0000 UTC m=+3387.246796257" observedRunningTime="2025-10-13 19:10:53.004165625 +0000 UTC m=+3387.908531725" watchObservedRunningTime="2025-10-13 19:10:53.00826469 +0000 UTC m=+3387.912630770" Oct 13 19:10:57 crc kubenswrapper[4974]: I1013 19:10:57.992709 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:57 crc kubenswrapper[4974]: I1013 19:10:57.993760 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:58 crc kubenswrapper[4974]: I1013 19:10:58.064374 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:59 crc kubenswrapper[4974]: I1013 19:10:59.141616 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:10:59 crc kubenswrapper[4974]: I1013 19:10:59.223711 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlp8w"] Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.078982 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dlp8w" podUID="822e6927-6f86-42e9-9d21-667b687dc274" containerName="registry-server" containerID="cri-o://2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632" gracePeriod=2 Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.602850 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.755942 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxkhn\" (UniqueName: \"kubernetes.io/projected/822e6927-6f86-42e9-9d21-667b687dc274-kube-api-access-jxkhn\") pod \"822e6927-6f86-42e9-9d21-667b687dc274\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.756069 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-catalog-content\") pod \"822e6927-6f86-42e9-9d21-667b687dc274\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.756256 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-utilities\") pod \"822e6927-6f86-42e9-9d21-667b687dc274\" (UID: \"822e6927-6f86-42e9-9d21-667b687dc274\") " Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.757518 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-utilities" (OuterVolumeSpecName: "utilities") pod "822e6927-6f86-42e9-9d21-667b687dc274" (UID: "822e6927-6f86-42e9-9d21-667b687dc274"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.762770 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/822e6927-6f86-42e9-9d21-667b687dc274-kube-api-access-jxkhn" (OuterVolumeSpecName: "kube-api-access-jxkhn") pod "822e6927-6f86-42e9-9d21-667b687dc274" (UID: "822e6927-6f86-42e9-9d21-667b687dc274"). InnerVolumeSpecName "kube-api-access-jxkhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.823583 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "822e6927-6f86-42e9-9d21-667b687dc274" (UID: "822e6927-6f86-42e9-9d21-667b687dc274"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.859916 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.859951 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxkhn\" (UniqueName: \"kubernetes.io/projected/822e6927-6f86-42e9-9d21-667b687dc274-kube-api-access-jxkhn\") on node \"crc\" DevicePath \"\"" Oct 13 19:11:01 crc kubenswrapper[4974]: I1013 19:11:01.859969 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/822e6927-6f86-42e9-9d21-667b687dc274-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.091624 4974 generic.go:334] "Generic (PLEG): container finished" podID="822e6927-6f86-42e9-9d21-667b687dc274" containerID="2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632" exitCode=0 Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.091706 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dlp8w" event={"ID":"822e6927-6f86-42e9-9d21-667b687dc274","Type":"ContainerDied","Data":"2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632"} Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.091775 4974 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-dlp8w" event={"ID":"822e6927-6f86-42e9-9d21-667b687dc274","Type":"ContainerDied","Data":"a3a9d52672d4db4516de547d85cedd4f72ae7eae3d94a58031baeadd752a6d78"} Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.091802 4974 scope.go:117] "RemoveContainer" containerID="2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632" Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.091735 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dlp8w" Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.121914 4974 scope.go:117] "RemoveContainer" containerID="1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22" Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.125325 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dlp8w"] Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.143235 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dlp8w"] Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.165365 4974 scope.go:117] "RemoveContainer" containerID="228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305" Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.209148 4974 scope.go:117] "RemoveContainer" containerID="2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632" Oct 13 19:11:02 crc kubenswrapper[4974]: E1013 19:11:02.210053 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632\": container with ID starting with 2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632 not found: ID does not exist" containerID="2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632" Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 
19:11:02.210094 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632"} err="failed to get container status \"2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632\": rpc error: code = NotFound desc = could not find container \"2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632\": container with ID starting with 2e7a6cac8c1a5452821140aa66803948d02b7e424fbd9e7d0d6389ce096f5632 not found: ID does not exist" Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.210122 4974 scope.go:117] "RemoveContainer" containerID="1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22" Oct 13 19:11:02 crc kubenswrapper[4974]: E1013 19:11:02.210500 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22\": container with ID starting with 1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22 not found: ID does not exist" containerID="1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22" Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.210533 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22"} err="failed to get container status \"1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22\": rpc error: code = NotFound desc = could not find container \"1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22\": container with ID starting with 1959200caf62339c8b53c749f3ed52060b441b25f5958e04d836827342f78f22 not found: ID does not exist" Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.210553 4974 scope.go:117] "RemoveContainer" containerID="228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305" Oct 13 19:11:02 crc 
kubenswrapper[4974]: E1013 19:11:02.210997 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305\": container with ID starting with 228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305 not found: ID does not exist" containerID="228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305" Oct 13 19:11:02 crc kubenswrapper[4974]: I1013 19:11:02.211025 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305"} err="failed to get container status \"228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305\": rpc error: code = NotFound desc = could not find container \"228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305\": container with ID starting with 228e1ec29c3908955c5827877aba7f85b14a5ede26530df9c1d07111f6921305 not found: ID does not exist" Oct 13 19:11:03 crc kubenswrapper[4974]: I1013 19:11:03.836195 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="822e6927-6f86-42e9-9d21-667b687dc274" path="/var/lib/kubelet/pods/822e6927-6f86-42e9-9d21-667b687dc274/volumes" Oct 13 19:11:04 crc kubenswrapper[4974]: I1013 19:11:04.812719 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:11:04 crc kubenswrapper[4974]: E1013 19:11:04.813585 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:11:18 crc 
kubenswrapper[4974]: I1013 19:11:18.812871 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:11:18 crc kubenswrapper[4974]: E1013 19:11:18.815485 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:11:29 crc kubenswrapper[4974]: I1013 19:11:29.811846 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:11:29 crc kubenswrapper[4974]: E1013 19:11:29.812691 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:11:41 crc kubenswrapper[4974]: I1013 19:11:41.811631 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:11:42 crc kubenswrapper[4974]: I1013 19:11:42.550379 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"3fce7d19a1fdbd6d85265bda51db42eafae54a3c1461bf9f15a6d4f3726731c8"} Oct 13 19:11:47 crc kubenswrapper[4974]: E1013 19:11:47.426959 4974 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 
38.102.83.30:57284->38.102.83.30:44205: write tcp 38.102.83.30:57284->38.102.83.30:44205: write: broken pipe Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.011926 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pms4w"] Oct 13 19:12:18 crc kubenswrapper[4974]: E1013 19:12:18.013020 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822e6927-6f86-42e9-9d21-667b687dc274" containerName="registry-server" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.013039 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="822e6927-6f86-42e9-9d21-667b687dc274" containerName="registry-server" Oct 13 19:12:18 crc kubenswrapper[4974]: E1013 19:12:18.013073 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822e6927-6f86-42e9-9d21-667b687dc274" containerName="extract-utilities" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.013084 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="822e6927-6f86-42e9-9d21-667b687dc274" containerName="extract-utilities" Oct 13 19:12:18 crc kubenswrapper[4974]: E1013 19:12:18.013102 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="822e6927-6f86-42e9-9d21-667b687dc274" containerName="extract-content" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.013109 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="822e6927-6f86-42e9-9d21-667b687dc274" containerName="extract-content" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.013343 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="822e6927-6f86-42e9-9d21-667b687dc274" containerName="registry-server" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.015317 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.046387 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pms4w"] Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.106807 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp4k4\" (UniqueName: \"kubernetes.io/projected/b55d81e5-a511-47cb-872a-c65e61cd7f6a-kube-api-access-dp4k4\") pod \"community-operators-pms4w\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.106852 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-catalog-content\") pod \"community-operators-pms4w\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.107002 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-utilities\") pod \"community-operators-pms4w\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.211280 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp4k4\" (UniqueName: \"kubernetes.io/projected/b55d81e5-a511-47cb-872a-c65e61cd7f6a-kube-api-access-dp4k4\") pod \"community-operators-pms4w\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.211368 4974 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-catalog-content\") pod \"community-operators-pms4w\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.211755 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-utilities\") pod \"community-operators-pms4w\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.212473 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-utilities\") pod \"community-operators-pms4w\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.212575 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-catalog-content\") pod \"community-operators-pms4w\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.238836 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp4k4\" (UniqueName: \"kubernetes.io/projected/b55d81e5-a511-47cb-872a-c65e61cd7f6a-kube-api-access-dp4k4\") pod \"community-operators-pms4w\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.343063 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.856150 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pms4w"] Oct 13 19:12:18 crc kubenswrapper[4974]: I1013 19:12:18.984642 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pms4w" event={"ID":"b55d81e5-a511-47cb-872a-c65e61cd7f6a","Type":"ContainerStarted","Data":"2c0e7285b13ce82afc638a27593c552f5544b5d9e331f31c672eb999417809fb"} Oct 13 19:12:19 crc kubenswrapper[4974]: I1013 19:12:19.998427 4974 generic.go:334] "Generic (PLEG): container finished" podID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerID="85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50" exitCode=0 Oct 13 19:12:19 crc kubenswrapper[4974]: I1013 19:12:19.998522 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pms4w" event={"ID":"b55d81e5-a511-47cb-872a-c65e61cd7f6a","Type":"ContainerDied","Data":"85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50"} Oct 13 19:12:22 crc kubenswrapper[4974]: I1013 19:12:22.022518 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pms4w" event={"ID":"b55d81e5-a511-47cb-872a-c65e61cd7f6a","Type":"ContainerStarted","Data":"aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6"} Oct 13 19:12:23 crc kubenswrapper[4974]: I1013 19:12:23.031952 4974 generic.go:334] "Generic (PLEG): container finished" podID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerID="aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6" exitCode=0 Oct 13 19:12:23 crc kubenswrapper[4974]: I1013 19:12:23.031992 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pms4w" 
event={"ID":"b55d81e5-a511-47cb-872a-c65e61cd7f6a","Type":"ContainerDied","Data":"aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6"} Oct 13 19:12:24 crc kubenswrapper[4974]: I1013 19:12:24.047840 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pms4w" event={"ID":"b55d81e5-a511-47cb-872a-c65e61cd7f6a","Type":"ContainerStarted","Data":"c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1"} Oct 13 19:12:24 crc kubenswrapper[4974]: I1013 19:12:24.086688 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pms4w" podStartSLOduration=3.55525129 podStartE2EDuration="7.086640859s" podCreationTimestamp="2025-10-13 19:12:17 +0000 UTC" firstStartedPulling="2025-10-13 19:12:20.001246347 +0000 UTC m=+3474.905612437" lastFinishedPulling="2025-10-13 19:12:23.532635916 +0000 UTC m=+3478.437002006" observedRunningTime="2025-10-13 19:12:24.07816593 +0000 UTC m=+3478.982532040" watchObservedRunningTime="2025-10-13 19:12:24.086640859 +0000 UTC m=+3478.991006949" Oct 13 19:12:28 crc kubenswrapper[4974]: I1013 19:12:28.343557 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:28 crc kubenswrapper[4974]: I1013 19:12:28.344251 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:28 crc kubenswrapper[4974]: I1013 19:12:28.411881 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:29 crc kubenswrapper[4974]: I1013 19:12:29.177742 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:29 crc kubenswrapper[4974]: I1013 19:12:29.234792 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-pms4w"] Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.124681 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-pms4w" podUID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerName="registry-server" containerID="cri-o://c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1" gracePeriod=2 Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.564494 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.624551 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-catalog-content\") pod \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.624617 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-utilities\") pod \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.624736 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp4k4\" (UniqueName: \"kubernetes.io/projected/b55d81e5-a511-47cb-872a-c65e61cd7f6a-kube-api-access-dp4k4\") pod \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\" (UID: \"b55d81e5-a511-47cb-872a-c65e61cd7f6a\") " Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.628298 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-utilities" (OuterVolumeSpecName: "utilities") pod "b55d81e5-a511-47cb-872a-c65e61cd7f6a" (UID: 
"b55d81e5-a511-47cb-872a-c65e61cd7f6a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.632009 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b55d81e5-a511-47cb-872a-c65e61cd7f6a-kube-api-access-dp4k4" (OuterVolumeSpecName: "kube-api-access-dp4k4") pod "b55d81e5-a511-47cb-872a-c65e61cd7f6a" (UID: "b55d81e5-a511-47cb-872a-c65e61cd7f6a"). InnerVolumeSpecName "kube-api-access-dp4k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.676130 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b55d81e5-a511-47cb-872a-c65e61cd7f6a" (UID: "b55d81e5-a511-47cb-872a-c65e61cd7f6a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.727026 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.727057 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp4k4\" (UniqueName: \"kubernetes.io/projected/b55d81e5-a511-47cb-872a-c65e61cd7f6a-kube-api-access-dp4k4\") on node \"crc\" DevicePath \"\"" Oct 13 19:12:31 crc kubenswrapper[4974]: I1013 19:12:31.727067 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b55d81e5-a511-47cb-872a-c65e61cd7f6a-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.139827 4974 generic.go:334] "Generic (PLEG): container finished" 
podID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerID="c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1" exitCode=0 Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.139925 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pms4w" event={"ID":"b55d81e5-a511-47cb-872a-c65e61cd7f6a","Type":"ContainerDied","Data":"c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1"} Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.140222 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pms4w" event={"ID":"b55d81e5-a511-47cb-872a-c65e61cd7f6a","Type":"ContainerDied","Data":"2c0e7285b13ce82afc638a27593c552f5544b5d9e331f31c672eb999417809fb"} Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.140250 4974 scope.go:117] "RemoveContainer" containerID="c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1" Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.140010 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-pms4w" Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.175629 4974 scope.go:117] "RemoveContainer" containerID="aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6" Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.183992 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-pms4w"] Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.195336 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-pms4w"] Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.226092 4974 scope.go:117] "RemoveContainer" containerID="85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50" Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.261065 4974 scope.go:117] "RemoveContainer" containerID="c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1" Oct 13 19:12:32 crc kubenswrapper[4974]: E1013 19:12:32.262629 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1\": container with ID starting with c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1 not found: ID does not exist" containerID="c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1" Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.262697 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1"} err="failed to get container status \"c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1\": rpc error: code = NotFound desc = could not find container \"c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1\": container with ID starting with c528c9547d7286eca468a08fc927457feeafb5897df9f485ad5144cae4df35a1 not 
found: ID does not exist" Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.262730 4974 scope.go:117] "RemoveContainer" containerID="aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6" Oct 13 19:12:32 crc kubenswrapper[4974]: E1013 19:12:32.263392 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6\": container with ID starting with aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6 not found: ID does not exist" containerID="aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6" Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.263438 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6"} err="failed to get container status \"aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6\": rpc error: code = NotFound desc = could not find container \"aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6\": container with ID starting with aa5d3cdbc852925e24d45a563f593bac130ee0c3015b9ce2067a9a57e5b9feb6 not found: ID does not exist" Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.263466 4974 scope.go:117] "RemoveContainer" containerID="85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50" Oct 13 19:12:32 crc kubenswrapper[4974]: E1013 19:12:32.263823 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50\": container with ID starting with 85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50 not found: ID does not exist" containerID="85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50" Oct 13 19:12:32 crc kubenswrapper[4974]: I1013 19:12:32.263859 4974 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50"} err="failed to get container status \"85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50\": rpc error: code = NotFound desc = could not find container \"85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50\": container with ID starting with 85c0546b5c5b41eb209b20289790e1754831d75db45ffe137a7232fd0d7d8d50 not found: ID does not exist" Oct 13 19:12:33 crc kubenswrapper[4974]: I1013 19:12:33.825994 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" path="/var/lib/kubelet/pods/b55d81e5-a511-47cb-872a-c65e61cd7f6a/volumes" Oct 13 19:14:07 crc kubenswrapper[4974]: I1013 19:14:07.742763 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:14:07 crc kubenswrapper[4974]: I1013 19:14:07.743690 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:14:37 crc kubenswrapper[4974]: I1013 19:14:37.742824 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:14:37 crc kubenswrapper[4974]: I1013 19:14:37.743558 4974 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.210597 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc"] Oct 13 19:15:00 crc kubenswrapper[4974]: E1013 19:15:00.212720 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerName="extract-content" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.212801 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerName="extract-content" Oct 13 19:15:00 crc kubenswrapper[4974]: E1013 19:15:00.212860 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerName="extract-utilities" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.212911 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerName="extract-utilities" Oct 13 19:15:00 crc kubenswrapper[4974]: E1013 19:15:00.212997 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerName="registry-server" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.213043 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerName="registry-server" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.213300 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="b55d81e5-a511-47cb-872a-c65e61cd7f6a" containerName="registry-server" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.214107 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.217187 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.221109 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc"] Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.222403 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.261916 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7733a86d-f657-47de-a8a6-50328f3a9392-config-volume\") pod \"collect-profiles-29339715-zvdxc\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.262038 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpsgf\" (UniqueName: \"kubernetes.io/projected/7733a86d-f657-47de-a8a6-50328f3a9392-kube-api-access-gpsgf\") pod \"collect-profiles-29339715-zvdxc\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.262104 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7733a86d-f657-47de-a8a6-50328f3a9392-secret-volume\") pod \"collect-profiles-29339715-zvdxc\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.364825 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpsgf\" (UniqueName: \"kubernetes.io/projected/7733a86d-f657-47de-a8a6-50328f3a9392-kube-api-access-gpsgf\") pod \"collect-profiles-29339715-zvdxc\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.365088 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7733a86d-f657-47de-a8a6-50328f3a9392-secret-volume\") pod \"collect-profiles-29339715-zvdxc\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.365281 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7733a86d-f657-47de-a8a6-50328f3a9392-config-volume\") pod \"collect-profiles-29339715-zvdxc\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.366420 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7733a86d-f657-47de-a8a6-50328f3a9392-config-volume\") pod \"collect-profiles-29339715-zvdxc\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.375858 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/7733a86d-f657-47de-a8a6-50328f3a9392-secret-volume\") pod \"collect-profiles-29339715-zvdxc\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.398785 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpsgf\" (UniqueName: \"kubernetes.io/projected/7733a86d-f657-47de-a8a6-50328f3a9392-kube-api-access-gpsgf\") pod \"collect-profiles-29339715-zvdxc\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.531292 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.843509 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc"] Oct 13 19:15:00 crc kubenswrapper[4974]: I1013 19:15:00.924309 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" event={"ID":"7733a86d-f657-47de-a8a6-50328f3a9392","Type":"ContainerStarted","Data":"1ea85587cd056a148f23139941eb500cf5f558e603fd348c3c6721e020c98ebc"} Oct 13 19:15:01 crc kubenswrapper[4974]: I1013 19:15:01.937948 4974 generic.go:334] "Generic (PLEG): container finished" podID="7733a86d-f657-47de-a8a6-50328f3a9392" containerID="eab290ffa0f05f430f550444a0ffac86a4fe7888eb35c4c80f1ced0471ecfbc8" exitCode=0 Oct 13 19:15:01 crc kubenswrapper[4974]: I1013 19:15:01.938036 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" 
event={"ID":"7733a86d-f657-47de-a8a6-50328f3a9392","Type":"ContainerDied","Data":"eab290ffa0f05f430f550444a0ffac86a4fe7888eb35c4c80f1ced0471ecfbc8"} Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.365639 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.438480 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7733a86d-f657-47de-a8a6-50328f3a9392-config-volume\") pod \"7733a86d-f657-47de-a8a6-50328f3a9392\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.438541 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7733a86d-f657-47de-a8a6-50328f3a9392-secret-volume\") pod \"7733a86d-f657-47de-a8a6-50328f3a9392\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.438595 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpsgf\" (UniqueName: \"kubernetes.io/projected/7733a86d-f657-47de-a8a6-50328f3a9392-kube-api-access-gpsgf\") pod \"7733a86d-f657-47de-a8a6-50328f3a9392\" (UID: \"7733a86d-f657-47de-a8a6-50328f3a9392\") " Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.439385 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7733a86d-f657-47de-a8a6-50328f3a9392-config-volume" (OuterVolumeSpecName: "config-volume") pod "7733a86d-f657-47de-a8a6-50328f3a9392" (UID: "7733a86d-f657-47de-a8a6-50328f3a9392"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.440341 4974 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7733a86d-f657-47de-a8a6-50328f3a9392-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.445093 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7733a86d-f657-47de-a8a6-50328f3a9392-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7733a86d-f657-47de-a8a6-50328f3a9392" (UID: "7733a86d-f657-47de-a8a6-50328f3a9392"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.445953 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7733a86d-f657-47de-a8a6-50328f3a9392-kube-api-access-gpsgf" (OuterVolumeSpecName: "kube-api-access-gpsgf") pod "7733a86d-f657-47de-a8a6-50328f3a9392" (UID: "7733a86d-f657-47de-a8a6-50328f3a9392"). InnerVolumeSpecName "kube-api-access-gpsgf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.542227 4974 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7733a86d-f657-47de-a8a6-50328f3a9392-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.542469 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpsgf\" (UniqueName: \"kubernetes.io/projected/7733a86d-f657-47de-a8a6-50328f3a9392-kube-api-access-gpsgf\") on node \"crc\" DevicePath \"\"" Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.986458 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" event={"ID":"7733a86d-f657-47de-a8a6-50328f3a9392","Type":"ContainerDied","Data":"1ea85587cd056a148f23139941eb500cf5f558e603fd348c3c6721e020c98ebc"} Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.987081 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ea85587cd056a148f23139941eb500cf5f558e603fd348c3c6721e020c98ebc" Oct 13 19:15:03 crc kubenswrapper[4974]: I1013 19:15:03.986903 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc" Oct 13 19:15:04 crc kubenswrapper[4974]: I1013 19:15:04.474977 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll"] Oct 13 19:15:04 crc kubenswrapper[4974]: I1013 19:15:04.487341 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339670-f6vll"] Oct 13 19:15:05 crc kubenswrapper[4974]: I1013 19:15:05.839917 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43954b5e-c77a-4ef7-bc58-42fc4f98600b" path="/var/lib/kubelet/pods/43954b5e-c77a-4ef7-bc58-42fc4f98600b/volumes" Oct 13 19:15:07 crc kubenswrapper[4974]: I1013 19:15:07.743134 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:15:07 crc kubenswrapper[4974]: I1013 19:15:07.743211 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:15:07 crc kubenswrapper[4974]: I1013 19:15:07.743266 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 19:15:07 crc kubenswrapper[4974]: I1013 19:15:07.744235 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3fce7d19a1fdbd6d85265bda51db42eafae54a3c1461bf9f15a6d4f3726731c8"} 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 19:15:07 crc kubenswrapper[4974]: I1013 19:15:07.744318 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://3fce7d19a1fdbd6d85265bda51db42eafae54a3c1461bf9f15a6d4f3726731c8" gracePeriod=600 Oct 13 19:15:08 crc kubenswrapper[4974]: I1013 19:15:08.040835 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="3fce7d19a1fdbd6d85265bda51db42eafae54a3c1461bf9f15a6d4f3726731c8" exitCode=0 Oct 13 19:15:08 crc kubenswrapper[4974]: I1013 19:15:08.040894 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"3fce7d19a1fdbd6d85265bda51db42eafae54a3c1461bf9f15a6d4f3726731c8"} Oct 13 19:15:08 crc kubenswrapper[4974]: I1013 19:15:08.041143 4974 scope.go:117] "RemoveContainer" containerID="7f4e9fedbd6ed5980187ba9a3840e7da79a8bafe550412d9398e8342eaa6113b" Oct 13 19:15:09 crc kubenswrapper[4974]: I1013 19:15:09.057776 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237"} Oct 13 19:15:50 crc kubenswrapper[4974]: I1013 19:15:50.640366 4974 scope.go:117] "RemoveContainer" containerID="0e6dfd0bcd19d62a6ae1cced2370558f6aa13aedb14c46a80038429e82cca7e3" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.025423 4974 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-l8j55"] Oct 13 19:16:07 crc kubenswrapper[4974]: E1013 19:16:07.026874 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7733a86d-f657-47de-a8a6-50328f3a9392" containerName="collect-profiles" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.026897 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="7733a86d-f657-47de-a8a6-50328f3a9392" containerName="collect-profiles" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.027359 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="7733a86d-f657-47de-a8a6-50328f3a9392" containerName="collect-profiles" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.031728 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.049042 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8j55"] Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.204991 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ljbt\" (UniqueName: \"kubernetes.io/projected/9b4c53d7-6d8d-4686-9099-c7febad40b87-kube-api-access-4ljbt\") pod \"redhat-marketplace-l8j55\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.205157 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-utilities\") pod \"redhat-marketplace-l8j55\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.205239 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-catalog-content\") pod \"redhat-marketplace-l8j55\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.307370 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ljbt\" (UniqueName: \"kubernetes.io/projected/9b4c53d7-6d8d-4686-9099-c7febad40b87-kube-api-access-4ljbt\") pod \"redhat-marketplace-l8j55\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.307469 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-utilities\") pod \"redhat-marketplace-l8j55\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.307519 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-catalog-content\") pod \"redhat-marketplace-l8j55\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.308359 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-utilities\") pod \"redhat-marketplace-l8j55\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.308470 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-catalog-content\") pod \"redhat-marketplace-l8j55\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.336753 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ljbt\" (UniqueName: \"kubernetes.io/projected/9b4c53d7-6d8d-4686-9099-c7febad40b87-kube-api-access-4ljbt\") pod \"redhat-marketplace-l8j55\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.368902 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:07 crc kubenswrapper[4974]: I1013 19:16:07.833173 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8j55"] Oct 13 19:16:07 crc kubenswrapper[4974]: W1013 19:16:07.842644 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4c53d7_6d8d_4686_9099_c7febad40b87.slice/crio-6bde35c4204adf32158c424007ece7230a16300783a290b814cf8ad2d9f94a22 WatchSource:0}: Error finding container 6bde35c4204adf32158c424007ece7230a16300783a290b814cf8ad2d9f94a22: Status 404 returned error can't find the container with id 6bde35c4204adf32158c424007ece7230a16300783a290b814cf8ad2d9f94a22 Oct 13 19:16:08 crc kubenswrapper[4974]: I1013 19:16:08.819249 4974 generic.go:334] "Generic (PLEG): container finished" podID="9b4c53d7-6d8d-4686-9099-c7febad40b87" containerID="5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319" exitCode=0 Oct 13 19:16:08 crc kubenswrapper[4974]: I1013 19:16:08.819360 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8j55" 
event={"ID":"9b4c53d7-6d8d-4686-9099-c7febad40b87","Type":"ContainerDied","Data":"5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319"} Oct 13 19:16:08 crc kubenswrapper[4974]: I1013 19:16:08.819735 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8j55" event={"ID":"9b4c53d7-6d8d-4686-9099-c7febad40b87","Type":"ContainerStarted","Data":"6bde35c4204adf32158c424007ece7230a16300783a290b814cf8ad2d9f94a22"} Oct 13 19:16:08 crc kubenswrapper[4974]: I1013 19:16:08.823473 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.217012 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qzdw8"] Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.223501 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.232695 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzdw8"] Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.355826 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-utilities\") pod \"redhat-operators-qzdw8\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.356018 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-catalog-content\") pod \"redhat-operators-qzdw8\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:09 crc 
kubenswrapper[4974]: I1013 19:16:09.356079 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mg24\" (UniqueName: \"kubernetes.io/projected/ce03382b-d6c9-4a8c-8e70-47bbf6906187-kube-api-access-2mg24\") pod \"redhat-operators-qzdw8\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.458053 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-utilities\") pod \"redhat-operators-qzdw8\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.458221 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-catalog-content\") pod \"redhat-operators-qzdw8\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.458287 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mg24\" (UniqueName: \"kubernetes.io/projected/ce03382b-d6c9-4a8c-8e70-47bbf6906187-kube-api-access-2mg24\") pod \"redhat-operators-qzdw8\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.458641 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-utilities\") pod \"redhat-operators-qzdw8\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 
19:16:09.458674 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-catalog-content\") pod \"redhat-operators-qzdw8\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.479369 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mg24\" (UniqueName: \"kubernetes.io/projected/ce03382b-d6c9-4a8c-8e70-47bbf6906187-kube-api-access-2mg24\") pod \"redhat-operators-qzdw8\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:09 crc kubenswrapper[4974]: I1013 19:16:09.560291 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:10 crc kubenswrapper[4974]: I1013 19:16:10.103276 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qzdw8"] Oct 13 19:16:10 crc kubenswrapper[4974]: I1013 19:16:10.860233 4974 generic.go:334] "Generic (PLEG): container finished" podID="9b4c53d7-6d8d-4686-9099-c7febad40b87" containerID="23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530" exitCode=0 Oct 13 19:16:10 crc kubenswrapper[4974]: I1013 19:16:10.860420 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8j55" event={"ID":"9b4c53d7-6d8d-4686-9099-c7febad40b87","Type":"ContainerDied","Data":"23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530"} Oct 13 19:16:10 crc kubenswrapper[4974]: I1013 19:16:10.863684 4974 generic.go:334] "Generic (PLEG): container finished" podID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerID="9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109" exitCode=0 Oct 13 19:16:10 crc kubenswrapper[4974]: I1013 19:16:10.863758 4974 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzdw8" event={"ID":"ce03382b-d6c9-4a8c-8e70-47bbf6906187","Type":"ContainerDied","Data":"9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109"} Oct 13 19:16:10 crc kubenswrapper[4974]: I1013 19:16:10.864084 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzdw8" event={"ID":"ce03382b-d6c9-4a8c-8e70-47bbf6906187","Type":"ContainerStarted","Data":"6062ea041db08dfc286a02fd35431ff58ab69806afad375a14b477c3c7bcd760"} Oct 13 19:16:11 crc kubenswrapper[4974]: I1013 19:16:11.881985 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8j55" event={"ID":"9b4c53d7-6d8d-4686-9099-c7febad40b87","Type":"ContainerStarted","Data":"9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090"} Oct 13 19:16:11 crc kubenswrapper[4974]: I1013 19:16:11.910166 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-l8j55" podStartSLOduration=3.465609792 podStartE2EDuration="5.910146395s" podCreationTimestamp="2025-10-13 19:16:06 +0000 UTC" firstStartedPulling="2025-10-13 19:16:08.823067127 +0000 UTC m=+3703.727433247" lastFinishedPulling="2025-10-13 19:16:11.26760376 +0000 UTC m=+3706.171969850" observedRunningTime="2025-10-13 19:16:11.907000117 +0000 UTC m=+3706.811366237" watchObservedRunningTime="2025-10-13 19:16:11.910146395 +0000 UTC m=+3706.814512475" Oct 13 19:16:12 crc kubenswrapper[4974]: I1013 19:16:12.892197 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzdw8" event={"ID":"ce03382b-d6c9-4a8c-8e70-47bbf6906187","Type":"ContainerStarted","Data":"44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4"} Oct 13 19:16:15 crc kubenswrapper[4974]: I1013 19:16:15.935584 4974 generic.go:334] "Generic (PLEG): container finished" 
podID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerID="44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4" exitCode=0 Oct 13 19:16:15 crc kubenswrapper[4974]: I1013 19:16:15.935709 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzdw8" event={"ID":"ce03382b-d6c9-4a8c-8e70-47bbf6906187","Type":"ContainerDied","Data":"44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4"} Oct 13 19:16:16 crc kubenswrapper[4974]: I1013 19:16:16.971192 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzdw8" event={"ID":"ce03382b-d6c9-4a8c-8e70-47bbf6906187","Type":"ContainerStarted","Data":"6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581"} Oct 13 19:16:16 crc kubenswrapper[4974]: I1013 19:16:16.997339 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qzdw8" podStartSLOduration=2.385639484 podStartE2EDuration="7.99731214s" podCreationTimestamp="2025-10-13 19:16:09 +0000 UTC" firstStartedPulling="2025-10-13 19:16:10.866193125 +0000 UTC m=+3705.770559245" lastFinishedPulling="2025-10-13 19:16:16.477865791 +0000 UTC m=+3711.382231901" observedRunningTime="2025-10-13 19:16:16.992254108 +0000 UTC m=+3711.896620198" watchObservedRunningTime="2025-10-13 19:16:16.99731214 +0000 UTC m=+3711.901678250" Oct 13 19:16:17 crc kubenswrapper[4974]: I1013 19:16:17.369094 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:17 crc kubenswrapper[4974]: I1013 19:16:17.369164 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:17 crc kubenswrapper[4974]: I1013 19:16:17.472352 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:18 crc 
kubenswrapper[4974]: I1013 19:16:18.067862 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:18 crc kubenswrapper[4974]: I1013 19:16:18.816014 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8j55"] Oct 13 19:16:19 crc kubenswrapper[4974]: I1013 19:16:19.561394 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:19 crc kubenswrapper[4974]: I1013 19:16:19.561432 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.002502 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-l8j55" podUID="9b4c53d7-6d8d-4686-9099-c7febad40b87" containerName="registry-server" containerID="cri-o://9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090" gracePeriod=2 Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.532698 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.628233 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qzdw8" podUID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerName="registry-server" probeResult="failure" output=< Oct 13 19:16:20 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 19:16:20 crc kubenswrapper[4974]: > Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.637214 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-utilities\") pod \"9b4c53d7-6d8d-4686-9099-c7febad40b87\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.637323 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ljbt\" (UniqueName: \"kubernetes.io/projected/9b4c53d7-6d8d-4686-9099-c7febad40b87-kube-api-access-4ljbt\") pod \"9b4c53d7-6d8d-4686-9099-c7febad40b87\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.637602 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-catalog-content\") pod \"9b4c53d7-6d8d-4686-9099-c7febad40b87\" (UID: \"9b4c53d7-6d8d-4686-9099-c7febad40b87\") " Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.638499 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-utilities" (OuterVolumeSpecName: "utilities") pod "9b4c53d7-6d8d-4686-9099-c7febad40b87" (UID: "9b4c53d7-6d8d-4686-9099-c7febad40b87"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.646992 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4c53d7-6d8d-4686-9099-c7febad40b87-kube-api-access-4ljbt" (OuterVolumeSpecName: "kube-api-access-4ljbt") pod "9b4c53d7-6d8d-4686-9099-c7febad40b87" (UID: "9b4c53d7-6d8d-4686-9099-c7febad40b87"). InnerVolumeSpecName "kube-api-access-4ljbt". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.660727 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9b4c53d7-6d8d-4686-9099-c7febad40b87" (UID: "9b4c53d7-6d8d-4686-9099-c7febad40b87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.741411 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.741731 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9b4c53d7-6d8d-4686-9099-c7febad40b87-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:16:20 crc kubenswrapper[4974]: I1013 19:16:20.741879 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ljbt\" (UniqueName: \"kubernetes.io/projected/9b4c53d7-6d8d-4686-9099-c7febad40b87-kube-api-access-4ljbt\") on node \"crc\" DevicePath \"\"" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.017068 4974 generic.go:334] "Generic (PLEG): container finished" podID="9b4c53d7-6d8d-4686-9099-c7febad40b87" 
containerID="9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090" exitCode=0 Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.017130 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-l8j55" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.017149 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8j55" event={"ID":"9b4c53d7-6d8d-4686-9099-c7febad40b87","Type":"ContainerDied","Data":"9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090"} Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.017195 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-l8j55" event={"ID":"9b4c53d7-6d8d-4686-9099-c7febad40b87","Type":"ContainerDied","Data":"6bde35c4204adf32158c424007ece7230a16300783a290b814cf8ad2d9f94a22"} Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.017217 4974 scope.go:117] "RemoveContainer" containerID="9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.057709 4974 scope.go:117] "RemoveContainer" containerID="23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.063024 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8j55"] Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.076106 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-l8j55"] Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.094347 4974 scope.go:117] "RemoveContainer" containerID="5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.154409 4974 scope.go:117] "RemoveContainer" containerID="9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090" Oct 13 
19:16:21 crc kubenswrapper[4974]: E1013 19:16:21.155004 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090\": container with ID starting with 9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090 not found: ID does not exist" containerID="9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.155041 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090"} err="failed to get container status \"9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090\": rpc error: code = NotFound desc = could not find container \"9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090\": container with ID starting with 9d778bbbed4f9f8f89243effbade0788708e76c6691082e5afd1d9c29ec9c090 not found: ID does not exist" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.155063 4974 scope.go:117] "RemoveContainer" containerID="23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530" Oct 13 19:16:21 crc kubenswrapper[4974]: E1013 19:16:21.155604 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530\": container with ID starting with 23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530 not found: ID does not exist" containerID="23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.155634 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530"} err="failed to get container status 
\"23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530\": rpc error: code = NotFound desc = could not find container \"23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530\": container with ID starting with 23aeea4c143d5a4b2e32d4ed57026b6a0285816ee2ffadd50f8403c64a07b530 not found: ID does not exist" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.155665 4974 scope.go:117] "RemoveContainer" containerID="5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319" Oct 13 19:16:21 crc kubenswrapper[4974]: E1013 19:16:21.156045 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319\": container with ID starting with 5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319 not found: ID does not exist" containerID="5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.156066 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319"} err="failed to get container status \"5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319\": rpc error: code = NotFound desc = could not find container \"5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319\": container with ID starting with 5856e16f470a6ef54da0bba5256fcf5fbfb7eee289489721657d595a02afb319 not found: ID does not exist" Oct 13 19:16:21 crc kubenswrapper[4974]: I1013 19:16:21.833591 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b4c53d7-6d8d-4686-9099-c7febad40b87" path="/var/lib/kubelet/pods/9b4c53d7-6d8d-4686-9099-c7febad40b87/volumes" Oct 13 19:16:29 crc kubenswrapper[4974]: I1013 19:16:29.646760 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:29 crc kubenswrapper[4974]: I1013 19:16:29.717878 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:29 crc kubenswrapper[4974]: I1013 19:16:29.897288 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qzdw8"] Oct 13 19:16:31 crc kubenswrapper[4974]: I1013 19:16:31.167929 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qzdw8" podUID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerName="registry-server" containerID="cri-o://6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581" gracePeriod=2 Oct 13 19:16:31 crc kubenswrapper[4974]: I1013 19:16:31.761351 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:31 crc kubenswrapper[4974]: I1013 19:16:31.829962 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mg24\" (UniqueName: \"kubernetes.io/projected/ce03382b-d6c9-4a8c-8e70-47bbf6906187-kube-api-access-2mg24\") pod \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " Oct 13 19:16:31 crc kubenswrapper[4974]: I1013 19:16:31.830130 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-utilities\") pod \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\" (UID: \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " Oct 13 19:16:31 crc kubenswrapper[4974]: I1013 19:16:31.830292 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-catalog-content\") pod \"ce03382b-d6c9-4a8c-8e70-47bbf6906187\" (UID: 
\"ce03382b-d6c9-4a8c-8e70-47bbf6906187\") " Oct 13 19:16:31 crc kubenswrapper[4974]: I1013 19:16:31.832426 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-utilities" (OuterVolumeSpecName: "utilities") pod "ce03382b-d6c9-4a8c-8e70-47bbf6906187" (UID: "ce03382b-d6c9-4a8c-8e70-47bbf6906187"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:16:31 crc kubenswrapper[4974]: I1013 19:16:31.857139 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce03382b-d6c9-4a8c-8e70-47bbf6906187-kube-api-access-2mg24" (OuterVolumeSpecName: "kube-api-access-2mg24") pod "ce03382b-d6c9-4a8c-8e70-47bbf6906187" (UID: "ce03382b-d6c9-4a8c-8e70-47bbf6906187"). InnerVolumeSpecName "kube-api-access-2mg24". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:16:31 crc kubenswrapper[4974]: I1013 19:16:31.933715 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mg24\" (UniqueName: \"kubernetes.io/projected/ce03382b-d6c9-4a8c-8e70-47bbf6906187-kube-api-access-2mg24\") on node \"crc\" DevicePath \"\"" Oct 13 19:16:31 crc kubenswrapper[4974]: I1013 19:16:31.933972 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:16:31 crc kubenswrapper[4974]: I1013 19:16:31.942097 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce03382b-d6c9-4a8c-8e70-47bbf6906187" (UID: "ce03382b-d6c9-4a8c-8e70-47bbf6906187"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.036362 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce03382b-d6c9-4a8c-8e70-47bbf6906187-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.184464 4974 generic.go:334] "Generic (PLEG): container finished" podID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerID="6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581" exitCode=0 Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.184541 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzdw8" event={"ID":"ce03382b-d6c9-4a8c-8e70-47bbf6906187","Type":"ContainerDied","Data":"6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581"} Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.184961 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qzdw8" event={"ID":"ce03382b-d6c9-4a8c-8e70-47bbf6906187","Type":"ContainerDied","Data":"6062ea041db08dfc286a02fd35431ff58ab69806afad375a14b477c3c7bcd760"} Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.184567 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qzdw8" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.184999 4974 scope.go:117] "RemoveContainer" containerID="6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.213682 4974 scope.go:117] "RemoveContainer" containerID="44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.245534 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qzdw8"] Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.254752 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qzdw8"] Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.262978 4974 scope.go:117] "RemoveContainer" containerID="9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.291492 4974 scope.go:117] "RemoveContainer" containerID="6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581" Oct 13 19:16:32 crc kubenswrapper[4974]: E1013 19:16:32.291957 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581\": container with ID starting with 6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581 not found: ID does not exist" containerID="6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.291989 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581"} err="failed to get container status \"6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581\": rpc error: code = NotFound desc = could not find container 
\"6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581\": container with ID starting with 6806b24101ca956af2fe619e973762be32e666a74d9a881df7eb57979bfac581 not found: ID does not exist" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.292013 4974 scope.go:117] "RemoveContainer" containerID="44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4" Oct 13 19:16:32 crc kubenswrapper[4974]: E1013 19:16:32.292289 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4\": container with ID starting with 44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4 not found: ID does not exist" containerID="44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.292338 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4"} err="failed to get container status \"44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4\": rpc error: code = NotFound desc = could not find container \"44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4\": container with ID starting with 44ae24bfece8433219466d328384893d16627eb86adcf93c57fca28b87cd71b4 not found: ID does not exist" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.292371 4974 scope.go:117] "RemoveContainer" containerID="9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109" Oct 13 19:16:32 crc kubenswrapper[4974]: E1013 19:16:32.292812 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109\": container with ID starting with 9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109 not found: ID does not exist" 
containerID="9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109" Oct 13 19:16:32 crc kubenswrapper[4974]: I1013 19:16:32.292840 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109"} err="failed to get container status \"9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109\": rpc error: code = NotFound desc = could not find container \"9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109\": container with ID starting with 9093e764b5eb3d219267aef8bdc8499217dd8d2a283fcf9cdfdbdd4949129109 not found: ID does not exist" Oct 13 19:16:33 crc kubenswrapper[4974]: I1013 19:16:33.836394 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" path="/var/lib/kubelet/pods/ce03382b-d6c9-4a8c-8e70-47bbf6906187/volumes" Oct 13 19:17:37 crc kubenswrapper[4974]: I1013 19:17:37.743608 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:17:37 crc kubenswrapper[4974]: I1013 19:17:37.744291 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:18:07 crc kubenswrapper[4974]: I1013 19:18:07.742953 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 13 19:18:07 crc kubenswrapper[4974]: I1013 19:18:07.743612 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:18:37 crc kubenswrapper[4974]: I1013 19:18:37.743708 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:18:37 crc kubenswrapper[4974]: I1013 19:18:37.744579 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:18:37 crc kubenswrapper[4974]: I1013 19:18:37.744691 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 19:18:37 crc kubenswrapper[4974]: I1013 19:18:37.746247 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 19:18:37 crc kubenswrapper[4974]: I1013 19:18:37.746414 4974 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" gracePeriod=600 Oct 13 19:18:37 crc kubenswrapper[4974]: E1013 19:18:37.877771 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:18:38 crc kubenswrapper[4974]: I1013 19:18:38.595426 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" exitCode=0 Oct 13 19:18:38 crc kubenswrapper[4974]: I1013 19:18:38.595529 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237"} Oct 13 19:18:38 crc kubenswrapper[4974]: I1013 19:18:38.595826 4974 scope.go:117] "RemoveContainer" containerID="3fce7d19a1fdbd6d85265bda51db42eafae54a3c1461bf9f15a6d4f3726731c8" Oct 13 19:18:38 crc kubenswrapper[4974]: I1013 19:18:38.596964 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:18:38 crc kubenswrapper[4974]: E1013 19:18:38.597690 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:18:49 crc kubenswrapper[4974]: I1013 19:18:49.815419 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:18:49 crc kubenswrapper[4974]: E1013 19:18:49.817068 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:19:03 crc kubenswrapper[4974]: I1013 19:19:03.812612 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:19:03 crc kubenswrapper[4974]: E1013 19:19:03.814036 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:19:18 crc kubenswrapper[4974]: I1013 19:19:18.811866 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:19:18 crc kubenswrapper[4974]: E1013 19:19:18.813081 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:19:30 crc kubenswrapper[4974]: I1013 19:19:30.812133 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:19:30 crc kubenswrapper[4974]: E1013 19:19:30.812896 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:19:43 crc kubenswrapper[4974]: I1013 19:19:43.813288 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:19:43 crc kubenswrapper[4974]: E1013 19:19:43.814492 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:19:56 crc kubenswrapper[4974]: I1013 19:19:56.812520 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:19:56 crc kubenswrapper[4974]: E1013 19:19:56.813527 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:20:10 crc kubenswrapper[4974]: I1013 19:20:10.813268 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:20:10 crc kubenswrapper[4974]: E1013 19:20:10.814424 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:20:25 crc kubenswrapper[4974]: I1013 19:20:25.826536 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:20:25 crc kubenswrapper[4974]: E1013 19:20:25.827407 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:20:38 crc kubenswrapper[4974]: I1013 19:20:38.811513 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:20:38 crc kubenswrapper[4974]: E1013 19:20:38.812553 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:20:53 crc kubenswrapper[4974]: I1013 19:20:53.812247 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:20:53 crc kubenswrapper[4974]: E1013 19:20:53.813158 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.774238 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r4mpc"] Oct 13 19:21:02 crc kubenswrapper[4974]: E1013 19:21:02.775312 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerName="extract-utilities" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.775332 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerName="extract-utilities" Oct 13 19:21:02 crc kubenswrapper[4974]: E1013 19:21:02.775364 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4c53d7-6d8d-4686-9099-c7febad40b87" containerName="extract-content" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.775373 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4c53d7-6d8d-4686-9099-c7febad40b87" containerName="extract-content" Oct 13 19:21:02 crc kubenswrapper[4974]: E1013 19:21:02.775390 4974 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerName="registry-server" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.775399 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerName="registry-server" Oct 13 19:21:02 crc kubenswrapper[4974]: E1013 19:21:02.775439 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4c53d7-6d8d-4686-9099-c7febad40b87" containerName="extract-utilities" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.775448 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4c53d7-6d8d-4686-9099-c7febad40b87" containerName="extract-utilities" Oct 13 19:21:02 crc kubenswrapper[4974]: E1013 19:21:02.775469 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4c53d7-6d8d-4686-9099-c7febad40b87" containerName="registry-server" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.775477 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4c53d7-6d8d-4686-9099-c7febad40b87" containerName="registry-server" Oct 13 19:21:02 crc kubenswrapper[4974]: E1013 19:21:02.775492 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerName="extract-content" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.775500 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerName="extract-content" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.775917 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce03382b-d6c9-4a8c-8e70-47bbf6906187" containerName="registry-server" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.775941 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4c53d7-6d8d-4686-9099-c7febad40b87" containerName="registry-server" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.777761 4974 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.789079 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r4mpc"] Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.857027 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0812172-6c40-43af-af1b-1e5a90ae8fe8-utilities\") pod \"certified-operators-r4mpc\" (UID: \"b0812172-6c40-43af-af1b-1e5a90ae8fe8\") " pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.857081 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp9h4\" (UniqueName: \"kubernetes.io/projected/b0812172-6c40-43af-af1b-1e5a90ae8fe8-kube-api-access-wp9h4\") pod \"certified-operators-r4mpc\" (UID: \"b0812172-6c40-43af-af1b-1e5a90ae8fe8\") " pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.857295 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0812172-6c40-43af-af1b-1e5a90ae8fe8-catalog-content\") pod \"certified-operators-r4mpc\" (UID: \"b0812172-6c40-43af-af1b-1e5a90ae8fe8\") " pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.959714 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0812172-6c40-43af-af1b-1e5a90ae8fe8-catalog-content\") pod \"certified-operators-r4mpc\" (UID: \"b0812172-6c40-43af-af1b-1e5a90ae8fe8\") " pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 
19:21:02.959835 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0812172-6c40-43af-af1b-1e5a90ae8fe8-utilities\") pod \"certified-operators-r4mpc\" (UID: \"b0812172-6c40-43af-af1b-1e5a90ae8fe8\") " pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.959868 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp9h4\" (UniqueName: \"kubernetes.io/projected/b0812172-6c40-43af-af1b-1e5a90ae8fe8-kube-api-access-wp9h4\") pod \"certified-operators-r4mpc\" (UID: \"b0812172-6c40-43af-af1b-1e5a90ae8fe8\") " pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.962093 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0812172-6c40-43af-af1b-1e5a90ae8fe8-catalog-content\") pod \"certified-operators-r4mpc\" (UID: \"b0812172-6c40-43af-af1b-1e5a90ae8fe8\") " pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.962756 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0812172-6c40-43af-af1b-1e5a90ae8fe8-utilities\") pod \"certified-operators-r4mpc\" (UID: \"b0812172-6c40-43af-af1b-1e5a90ae8fe8\") " pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:02 crc kubenswrapper[4974]: I1013 19:21:02.986491 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp9h4\" (UniqueName: \"kubernetes.io/projected/b0812172-6c40-43af-af1b-1e5a90ae8fe8-kube-api-access-wp9h4\") pod \"certified-operators-r4mpc\" (UID: \"b0812172-6c40-43af-af1b-1e5a90ae8fe8\") " pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:03 crc kubenswrapper[4974]: I1013 19:21:03.105921 4974 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:03 crc kubenswrapper[4974]: I1013 19:21:03.697456 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r4mpc"] Oct 13 19:21:04 crc kubenswrapper[4974]: I1013 19:21:04.440464 4974 generic.go:334] "Generic (PLEG): container finished" podID="b0812172-6c40-43af-af1b-1e5a90ae8fe8" containerID="9f7ddfb9d7741d3d5582c64f01b67163c66cf0d13065f9002e44b0e62d0f7f42" exitCode=0 Oct 13 19:21:04 crc kubenswrapper[4974]: I1013 19:21:04.440739 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4mpc" event={"ID":"b0812172-6c40-43af-af1b-1e5a90ae8fe8","Type":"ContainerDied","Data":"9f7ddfb9d7741d3d5582c64f01b67163c66cf0d13065f9002e44b0e62d0f7f42"} Oct 13 19:21:04 crc kubenswrapper[4974]: I1013 19:21:04.440771 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4mpc" event={"ID":"b0812172-6c40-43af-af1b-1e5a90ae8fe8","Type":"ContainerStarted","Data":"e3a6bbda2fb82c33fb0519d589784b4201bc16c4c5341dc00e3b4ecb6b56d785"} Oct 13 19:21:08 crc kubenswrapper[4974]: I1013 19:21:08.811839 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:21:08 crc kubenswrapper[4974]: E1013 19:21:08.812842 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:21:09 crc kubenswrapper[4974]: I1013 19:21:09.495015 4974 generic.go:334] "Generic (PLEG): container finished" 
podID="b0812172-6c40-43af-af1b-1e5a90ae8fe8" containerID="926b0b4b42559ec6700d71188cf13521a219670d70eaeef021fba5150be67377" exitCode=0 Oct 13 19:21:09 crc kubenswrapper[4974]: I1013 19:21:09.495090 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4mpc" event={"ID":"b0812172-6c40-43af-af1b-1e5a90ae8fe8","Type":"ContainerDied","Data":"926b0b4b42559ec6700d71188cf13521a219670d70eaeef021fba5150be67377"} Oct 13 19:21:09 crc kubenswrapper[4974]: I1013 19:21:09.499025 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 19:21:10 crc kubenswrapper[4974]: I1013 19:21:10.510007 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r4mpc" event={"ID":"b0812172-6c40-43af-af1b-1e5a90ae8fe8","Type":"ContainerStarted","Data":"795a74ba6ca64cb171883ea46dc444015bbe2518994347343d125440bce89b00"} Oct 13 19:21:10 crc kubenswrapper[4974]: I1013 19:21:10.534522 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r4mpc" podStartSLOduration=2.946369315 podStartE2EDuration="8.534485978s" podCreationTimestamp="2025-10-13 19:21:02 +0000 UTC" firstStartedPulling="2025-10-13 19:21:04.442417842 +0000 UTC m=+3999.346783932" lastFinishedPulling="2025-10-13 19:21:10.030534505 +0000 UTC m=+4004.934900595" observedRunningTime="2025-10-13 19:21:10.529141688 +0000 UTC m=+4005.433507788" watchObservedRunningTime="2025-10-13 19:21:10.534485978 +0000 UTC m=+4005.438852098" Oct 13 19:21:13 crc kubenswrapper[4974]: I1013 19:21:13.106321 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:13 crc kubenswrapper[4974]: I1013 19:21:13.106736 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:13 crc kubenswrapper[4974]: 
I1013 19:21:13.210355 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:21 crc kubenswrapper[4974]: I1013 19:21:21.813209 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:21:21 crc kubenswrapper[4974]: E1013 19:21:21.814782 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.167850 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r4mpc" Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.260542 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r4mpc"] Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.292852 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-26g4k"] Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.293164 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-26g4k" podUID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerName="registry-server" containerID="cri-o://ec21deab25ff91f6a322455846260e426a25c4ef7abbf90e8b1fbffef7aaa324" gracePeriod=2 Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.669236 4974 generic.go:334] "Generic (PLEG): container finished" podID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerID="ec21deab25ff91f6a322455846260e426a25c4ef7abbf90e8b1fbffef7aaa324" exitCode=0 Oct 13 
19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.669752 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26g4k" event={"ID":"850191fb-a1bf-43b6-910c-cf2a1da233f5","Type":"ContainerDied","Data":"ec21deab25ff91f6a322455846260e426a25c4ef7abbf90e8b1fbffef7aaa324"} Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.875786 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-26g4k" Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.971747 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-utilities\") pod \"850191fb-a1bf-43b6-910c-cf2a1da233f5\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.972304 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m87qj\" (UniqueName: \"kubernetes.io/projected/850191fb-a1bf-43b6-910c-cf2a1da233f5-kube-api-access-m87qj\") pod \"850191fb-a1bf-43b6-910c-cf2a1da233f5\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.972601 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-catalog-content\") pod \"850191fb-a1bf-43b6-910c-cf2a1da233f5\" (UID: \"850191fb-a1bf-43b6-910c-cf2a1da233f5\") " Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.973298 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-utilities" (OuterVolumeSpecName: "utilities") pod "850191fb-a1bf-43b6-910c-cf2a1da233f5" (UID: "850191fb-a1bf-43b6-910c-cf2a1da233f5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:21:23 crc kubenswrapper[4974]: I1013 19:21:23.997787 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/850191fb-a1bf-43b6-910c-cf2a1da233f5-kube-api-access-m87qj" (OuterVolumeSpecName: "kube-api-access-m87qj") pod "850191fb-a1bf-43b6-910c-cf2a1da233f5" (UID: "850191fb-a1bf-43b6-910c-cf2a1da233f5"). InnerVolumeSpecName "kube-api-access-m87qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.043213 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "850191fb-a1bf-43b6-910c-cf2a1da233f5" (UID: "850191fb-a1bf-43b6-910c-cf2a1da233f5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.075681 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.075722 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m87qj\" (UniqueName: \"kubernetes.io/projected/850191fb-a1bf-43b6-910c-cf2a1da233f5-kube-api-access-m87qj\") on node \"crc\" DevicePath \"\"" Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.075734 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/850191fb-a1bf-43b6-910c-cf2a1da233f5-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.683829 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-26g4k" 
event={"ID":"850191fb-a1bf-43b6-910c-cf2a1da233f5","Type":"ContainerDied","Data":"0aa8f3325d0475410e826cb5be709669268d8d1933b7532fdeff34fefd66cd8c"} Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.684106 4974 scope.go:117] "RemoveContainer" containerID="ec21deab25ff91f6a322455846260e426a25c4ef7abbf90e8b1fbffef7aaa324" Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.683938 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-26g4k" Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.720609 4974 scope.go:117] "RemoveContainer" containerID="b053562002e9ac192eec734232bc07504d2c0702ac82c0fa5f1343730471ae17" Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.729463 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-26g4k"] Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.740544 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-26g4k"] Oct 13 19:21:24 crc kubenswrapper[4974]: I1013 19:21:24.748901 4974 scope.go:117] "RemoveContainer" containerID="095b4b702996fc4958f47e5cfbbae5da390f3b83e341cf4428004a04eb3827bd" Oct 13 19:21:25 crc kubenswrapper[4974]: I1013 19:21:25.829779 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="850191fb-a1bf-43b6-910c-cf2a1da233f5" path="/var/lib/kubelet/pods/850191fb-a1bf-43b6-910c-cf2a1da233f5/volumes" Oct 13 19:21:34 crc kubenswrapper[4974]: I1013 19:21:34.812141 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:21:34 crc kubenswrapper[4974]: E1013 19:21:34.812970 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:21:49 crc kubenswrapper[4974]: I1013 19:21:49.812275 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:21:49 crc kubenswrapper[4974]: E1013 19:21:49.813378 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:22:04 crc kubenswrapper[4974]: I1013 19:22:04.811239 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:22:04 crc kubenswrapper[4974]: E1013 19:22:04.812414 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:22:19 crc kubenswrapper[4974]: I1013 19:22:19.812255 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:22:19 crc kubenswrapper[4974]: E1013 19:22:19.813384 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:22:34 crc kubenswrapper[4974]: I1013 19:22:34.812255 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:22:34 crc kubenswrapper[4974]: E1013 19:22:34.813909 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:22:46 crc kubenswrapper[4974]: I1013 19:22:46.818024 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:22:46 crc kubenswrapper[4974]: E1013 19:22:46.818979 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:22:57 crc kubenswrapper[4974]: I1013 19:22:57.812888 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:22:57 crc kubenswrapper[4974]: E1013 19:22:57.814190 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:23:12 crc kubenswrapper[4974]: I1013 19:23:12.812459 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:23:12 crc kubenswrapper[4974]: E1013 19:23:12.813705 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:23:26 crc kubenswrapper[4974]: I1013 19:23:26.812110 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:23:26 crc kubenswrapper[4974]: E1013 19:23:26.813239 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:23:37 crc kubenswrapper[4974]: I1013 19:23:37.813193 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237" Oct 13 19:23:38 crc kubenswrapper[4974]: I1013 19:23:38.336573 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" 
event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"49c73ec8d4c50439efc253b3fdcd9b11cb8e08337975c5d640f814ffccb7475c"} Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.012882 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bqxch"] Oct 13 19:25:28 crc kubenswrapper[4974]: E1013 19:25:28.014717 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerName="extract-content" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.014755 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerName="extract-content" Oct 13 19:25:28 crc kubenswrapper[4974]: E1013 19:25:28.014869 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerName="registry-server" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.014893 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerName="registry-server" Oct 13 19:25:28 crc kubenswrapper[4974]: E1013 19:25:28.014939 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerName="extract-utilities" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.014958 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerName="extract-utilities" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.015572 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="850191fb-a1bf-43b6-910c-cf2a1da233f5" containerName="registry-server" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.024804 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.025885 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bqxch"] Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.174150 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-utilities\") pod \"community-operators-bqxch\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.174461 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xqcm\" (UniqueName: \"kubernetes.io/projected/4db0cc05-0a7e-4925-8775-db8d0b74929c-kube-api-access-8xqcm\") pod \"community-operators-bqxch\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.174584 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-catalog-content\") pod \"community-operators-bqxch\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.276714 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-utilities\") pod \"community-operators-bqxch\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.276782 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8xqcm\" (UniqueName: \"kubernetes.io/projected/4db0cc05-0a7e-4925-8775-db8d0b74929c-kube-api-access-8xqcm\") pod \"community-operators-bqxch\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.276825 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-catalog-content\") pod \"community-operators-bqxch\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.277271 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-utilities\") pod \"community-operators-bqxch\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.277457 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-catalog-content\") pod \"community-operators-bqxch\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.299346 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xqcm\" (UniqueName: \"kubernetes.io/projected/4db0cc05-0a7e-4925-8775-db8d0b74929c-kube-api-access-8xqcm\") pod \"community-operators-bqxch\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.364754 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:28 crc kubenswrapper[4974]: I1013 19:25:28.854954 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bqxch"] Oct 13 19:25:29 crc kubenswrapper[4974]: I1013 19:25:29.663374 4974 generic.go:334] "Generic (PLEG): container finished" podID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerID="dcc419a9fbba5ca001a70c002e498b5563810fe14e2ad77362247520d434a684" exitCode=0 Oct 13 19:25:29 crc kubenswrapper[4974]: I1013 19:25:29.663499 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqxch" event={"ID":"4db0cc05-0a7e-4925-8775-db8d0b74929c","Type":"ContainerDied","Data":"dcc419a9fbba5ca001a70c002e498b5563810fe14e2ad77362247520d434a684"} Oct 13 19:25:29 crc kubenswrapper[4974]: I1013 19:25:29.663957 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqxch" event={"ID":"4db0cc05-0a7e-4925-8775-db8d0b74929c","Type":"ContainerStarted","Data":"2f4eda224528cc7a924becb824a89a1f3c4cac80a64117704e86ab9460275af5"} Oct 13 19:25:30 crc kubenswrapper[4974]: I1013 19:25:30.677578 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqxch" event={"ID":"4db0cc05-0a7e-4925-8775-db8d0b74929c","Type":"ContainerStarted","Data":"3e1203c8581fe2820cd1eb0e38389c6953de687ac8aea017d26556ab27f9d2ce"} Oct 13 19:25:32 crc kubenswrapper[4974]: I1013 19:25:32.697166 4974 generic.go:334] "Generic (PLEG): container finished" podID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerID="3e1203c8581fe2820cd1eb0e38389c6953de687ac8aea017d26556ab27f9d2ce" exitCode=0 Oct 13 19:25:32 crc kubenswrapper[4974]: I1013 19:25:32.697223 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqxch" 
event={"ID":"4db0cc05-0a7e-4925-8775-db8d0b74929c","Type":"ContainerDied","Data":"3e1203c8581fe2820cd1eb0e38389c6953de687ac8aea017d26556ab27f9d2ce"} Oct 13 19:25:33 crc kubenswrapper[4974]: I1013 19:25:33.709232 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqxch" event={"ID":"4db0cc05-0a7e-4925-8775-db8d0b74929c","Type":"ContainerStarted","Data":"7b13d8eb30f3775ac479096bae99220b8b1d8bc5520af6c19c7bb78e8f0570bb"} Oct 13 19:25:33 crc kubenswrapper[4974]: I1013 19:25:33.733456 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bqxch" podStartSLOduration=3.246250401 podStartE2EDuration="6.733439508s" podCreationTimestamp="2025-10-13 19:25:27 +0000 UTC" firstStartedPulling="2025-10-13 19:25:29.673369751 +0000 UTC m=+4264.577735861" lastFinishedPulling="2025-10-13 19:25:33.160558888 +0000 UTC m=+4268.064924968" observedRunningTime="2025-10-13 19:25:33.730579788 +0000 UTC m=+4268.634945868" watchObservedRunningTime="2025-10-13 19:25:33.733439508 +0000 UTC m=+4268.637805578" Oct 13 19:25:38 crc kubenswrapper[4974]: I1013 19:25:38.365122 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:38 crc kubenswrapper[4974]: I1013 19:25:38.366112 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:38 crc kubenswrapper[4974]: I1013 19:25:38.465358 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:38 crc kubenswrapper[4974]: I1013 19:25:38.828323 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:42 crc kubenswrapper[4974]: I1013 19:25:42.597885 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-bqxch"] Oct 13 19:25:42 crc kubenswrapper[4974]: I1013 19:25:42.598683 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bqxch" podUID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerName="registry-server" containerID="cri-o://7b13d8eb30f3775ac479096bae99220b8b1d8bc5520af6c19c7bb78e8f0570bb" gracePeriod=2 Oct 13 19:25:42 crc kubenswrapper[4974]: I1013 19:25:42.825014 4974 generic.go:334] "Generic (PLEG): container finished" podID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerID="7b13d8eb30f3775ac479096bae99220b8b1d8bc5520af6c19c7bb78e8f0570bb" exitCode=0 Oct 13 19:25:42 crc kubenswrapper[4974]: I1013 19:25:42.825100 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqxch" event={"ID":"4db0cc05-0a7e-4925-8775-db8d0b74929c","Type":"ContainerDied","Data":"7b13d8eb30f3775ac479096bae99220b8b1d8bc5520af6c19c7bb78e8f0570bb"} Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.220355 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.347563 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-utilities\") pod \"4db0cc05-0a7e-4925-8775-db8d0b74929c\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.348161 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-catalog-content\") pod \"4db0cc05-0a7e-4925-8775-db8d0b74929c\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.348238 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xqcm\" (UniqueName: \"kubernetes.io/projected/4db0cc05-0a7e-4925-8775-db8d0b74929c-kube-api-access-8xqcm\") pod \"4db0cc05-0a7e-4925-8775-db8d0b74929c\" (UID: \"4db0cc05-0a7e-4925-8775-db8d0b74929c\") " Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.349076 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-utilities" (OuterVolumeSpecName: "utilities") pod "4db0cc05-0a7e-4925-8775-db8d0b74929c" (UID: "4db0cc05-0a7e-4925-8775-db8d0b74929c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.370618 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4db0cc05-0a7e-4925-8775-db8d0b74929c-kube-api-access-8xqcm" (OuterVolumeSpecName: "kube-api-access-8xqcm") pod "4db0cc05-0a7e-4925-8775-db8d0b74929c" (UID: "4db0cc05-0a7e-4925-8775-db8d0b74929c"). InnerVolumeSpecName "kube-api-access-8xqcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.407163 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4db0cc05-0a7e-4925-8775-db8d0b74929c" (UID: "4db0cc05-0a7e-4925-8775-db8d0b74929c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.450413 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xqcm\" (UniqueName: \"kubernetes.io/projected/4db0cc05-0a7e-4925-8775-db8d0b74929c-kube-api-access-8xqcm\") on node \"crc\" DevicePath \"\"" Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.450603 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.450688 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4db0cc05-0a7e-4925-8775-db8d0b74929c-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.852533 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bqxch" event={"ID":"4db0cc05-0a7e-4925-8775-db8d0b74929c","Type":"ContainerDied","Data":"2f4eda224528cc7a924becb824a89a1f3c4cac80a64117704e86ab9460275af5"} Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.852613 4974 scope.go:117] "RemoveContainer" containerID="7b13d8eb30f3775ac479096bae99220b8b1d8bc5520af6c19c7bb78e8f0570bb" Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.852709 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bqxch" Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.889086 4974 scope.go:117] "RemoveContainer" containerID="3e1203c8581fe2820cd1eb0e38389c6953de687ac8aea017d26556ab27f9d2ce" Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.897360 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bqxch"] Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.912729 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bqxch"] Oct 13 19:25:43 crc kubenswrapper[4974]: I1013 19:25:43.921467 4974 scope.go:117] "RemoveContainer" containerID="dcc419a9fbba5ca001a70c002e498b5563810fe14e2ad77362247520d434a684" Oct 13 19:25:45 crc kubenswrapper[4974]: I1013 19:25:45.827415 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4db0cc05-0a7e-4925-8775-db8d0b74929c" path="/var/lib/kubelet/pods/4db0cc05-0a7e-4925-8775-db8d0b74929c/volumes" Oct 13 19:26:07 crc kubenswrapper[4974]: I1013 19:26:07.742636 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:26:07 crc kubenswrapper[4974]: I1013 19:26:07.743233 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:26:37 crc kubenswrapper[4974]: I1013 19:26:37.743469 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:26:37 crc kubenswrapper[4974]: I1013 19:26:37.744150 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.268746 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7hjlq"] Oct 13 19:26:49 crc kubenswrapper[4974]: E1013 19:26:49.269563 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerName="extract-utilities" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.269576 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerName="extract-utilities" Oct 13 19:26:49 crc kubenswrapper[4974]: E1013 19:26:49.269609 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerName="registry-server" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.269690 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerName="registry-server" Oct 13 19:26:49 crc kubenswrapper[4974]: E1013 19:26:49.269699 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerName="extract-content" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.269705 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerName="extract-content" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.269895 4974 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4db0cc05-0a7e-4925-8775-db8d0b74929c" containerName="registry-server" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.271563 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.293645 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hjlq"] Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.412383 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-utilities\") pod \"redhat-marketplace-7hjlq\" (UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") " pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.412452 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-catalog-content\") pod \"redhat-marketplace-7hjlq\" (UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") " pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.412594 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scmjs\" (UniqueName: \"kubernetes.io/projected/184ec4aa-3ec1-481f-afa4-3242355bd0a3-kube-api-access-scmjs\") pod \"redhat-marketplace-7hjlq\" (UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") " pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.514750 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-utilities\") pod \"redhat-marketplace-7hjlq\" 
(UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") " pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.514822 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-catalog-content\") pod \"redhat-marketplace-7hjlq\" (UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") " pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.514931 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scmjs\" (UniqueName: \"kubernetes.io/projected/184ec4aa-3ec1-481f-afa4-3242355bd0a3-kube-api-access-scmjs\") pod \"redhat-marketplace-7hjlq\" (UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") " pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.515888 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-utilities\") pod \"redhat-marketplace-7hjlq\" (UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") " pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.515945 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-catalog-content\") pod \"redhat-marketplace-7hjlq\" (UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") " pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.538290 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scmjs\" (UniqueName: \"kubernetes.io/projected/184ec4aa-3ec1-481f-afa4-3242355bd0a3-kube-api-access-scmjs\") pod \"redhat-marketplace-7hjlq\" (UID: 
\"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") " pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:49 crc kubenswrapper[4974]: I1013 19:26:49.592560 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hjlq" Oct 13 19:26:50 crc kubenswrapper[4974]: I1013 19:26:50.123443 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hjlq"] Oct 13 19:26:50 crc kubenswrapper[4974]: W1013 19:26:50.131896 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184ec4aa_3ec1_481f_afa4_3242355bd0a3.slice/crio-8fabdfea4e400dcadacdcefca84ab9dbbccdb72eb1d810c7fbaa5e59509092b7 WatchSource:0}: Error finding container 8fabdfea4e400dcadacdcefca84ab9dbbccdb72eb1d810c7fbaa5e59509092b7: Status 404 returned error can't find the container with id 8fabdfea4e400dcadacdcefca84ab9dbbccdb72eb1d810c7fbaa5e59509092b7 Oct 13 19:26:50 crc kubenswrapper[4974]: I1013 19:26:50.700093 4974 generic.go:334] "Generic (PLEG): container finished" podID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerID="be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4" exitCode=0 Oct 13 19:26:50 crc kubenswrapper[4974]: I1013 19:26:50.700319 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hjlq" event={"ID":"184ec4aa-3ec1-481f-afa4-3242355bd0a3","Type":"ContainerDied","Data":"be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4"} Oct 13 19:26:50 crc kubenswrapper[4974]: I1013 19:26:50.700415 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hjlq" event={"ID":"184ec4aa-3ec1-481f-afa4-3242355bd0a3","Type":"ContainerStarted","Data":"8fabdfea4e400dcadacdcefca84ab9dbbccdb72eb1d810c7fbaa5e59509092b7"} Oct 13 19:26:50 crc kubenswrapper[4974]: I1013 19:26:50.703684 4974 provider.go:102] Refreshing 
cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 13 19:26:51 crc kubenswrapper[4974]: I1013 19:26:51.711118 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hjlq" event={"ID":"184ec4aa-3ec1-481f-afa4-3242355bd0a3","Type":"ContainerStarted","Data":"1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7"}
Oct 13 19:26:52 crc kubenswrapper[4974]: I1013 19:26:52.723080 4974 generic.go:334] "Generic (PLEG): container finished" podID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerID="1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7" exitCode=0
Oct 13 19:26:52 crc kubenswrapper[4974]: I1013 19:26:52.723188 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hjlq" event={"ID":"184ec4aa-3ec1-481f-afa4-3242355bd0a3","Type":"ContainerDied","Data":"1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7"}
Oct 13 19:26:53 crc kubenswrapper[4974]: I1013 19:26:53.737701 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hjlq" event={"ID":"184ec4aa-3ec1-481f-afa4-3242355bd0a3","Type":"ContainerStarted","Data":"c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f"}
Oct 13 19:26:53 crc kubenswrapper[4974]: I1013 19:26:53.759689 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7hjlq" podStartSLOduration=2.264860536 podStartE2EDuration="4.759667392s" podCreationTimestamp="2025-10-13 19:26:49 +0000 UTC" firstStartedPulling="2025-10-13 19:26:50.703369367 +0000 UTC m=+4345.607735447" lastFinishedPulling="2025-10-13 19:26:53.198176223 +0000 UTC m=+4348.102542303" observedRunningTime="2025-10-13 19:26:53.755080833 +0000 UTC m=+4348.659446963" watchObservedRunningTime="2025-10-13 19:26:53.759667392 +0000 UTC m=+4348.664033472"
Oct 13 19:26:59 crc kubenswrapper[4974]: I1013 19:26:59.593463 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7hjlq"
Oct 13 19:26:59 crc kubenswrapper[4974]: I1013 19:26:59.595645 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7hjlq"
Oct 13 19:26:59 crc kubenswrapper[4974]: I1013 19:26:59.645026 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7hjlq"
Oct 13 19:26:59 crc kubenswrapper[4974]: I1013 19:26:59.873471 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7hjlq"
Oct 13 19:26:59 crc kubenswrapper[4974]: I1013 19:26:59.937372 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hjlq"]
Oct 13 19:27:01 crc kubenswrapper[4974]: I1013 19:27:01.826434 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7hjlq" podUID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerName="registry-server" containerID="cri-o://c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f" gracePeriod=2
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.373766 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hjlq"
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.389192 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-scmjs\" (UniqueName: \"kubernetes.io/projected/184ec4aa-3ec1-481f-afa4-3242355bd0a3-kube-api-access-scmjs\") pod \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\" (UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") "
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.389312 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-catalog-content\") pod \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\" (UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") "
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.389336 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-utilities\") pod \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\" (UID: \"184ec4aa-3ec1-481f-afa4-3242355bd0a3\") "
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.390563 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-utilities" (OuterVolumeSpecName: "utilities") pod "184ec4aa-3ec1-481f-afa4-3242355bd0a3" (UID: "184ec4aa-3ec1-481f-afa4-3242355bd0a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.401716 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/184ec4aa-3ec1-481f-afa4-3242355bd0a3-kube-api-access-scmjs" (OuterVolumeSpecName: "kube-api-access-scmjs") pod "184ec4aa-3ec1-481f-afa4-3242355bd0a3" (UID: "184ec4aa-3ec1-481f-afa4-3242355bd0a3"). InnerVolumeSpecName "kube-api-access-scmjs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.412160 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "184ec4aa-3ec1-481f-afa4-3242355bd0a3" (UID: "184ec4aa-3ec1-481f-afa4-3242355bd0a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.492024 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-scmjs\" (UniqueName: \"kubernetes.io/projected/184ec4aa-3ec1-481f-afa4-3242355bd0a3-kube-api-access-scmjs\") on node \"crc\" DevicePath \"\""
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.492059 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.492072 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/184ec4aa-3ec1-481f-afa4-3242355bd0a3-utilities\") on node \"crc\" DevicePath \"\""
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.844090 4974 generic.go:334] "Generic (PLEG): container finished" podID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerID="c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f" exitCode=0
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.844141 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hjlq" event={"ID":"184ec4aa-3ec1-481f-afa4-3242355bd0a3","Type":"ContainerDied","Data":"c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f"}
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.844171 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7hjlq" event={"ID":"184ec4aa-3ec1-481f-afa4-3242355bd0a3","Type":"ContainerDied","Data":"8fabdfea4e400dcadacdcefca84ab9dbbccdb72eb1d810c7fbaa5e59509092b7"}
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.844181 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7hjlq"
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.844192 4974 scope.go:117] "RemoveContainer" containerID="c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f"
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.868854 4974 scope.go:117] "RemoveContainer" containerID="1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7"
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.904363 4974 scope.go:117] "RemoveContainer" containerID="be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4"
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.911141 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hjlq"]
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.921784 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7hjlq"]
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.976537 4974 scope.go:117] "RemoveContainer" containerID="c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f"
Oct 13 19:27:02 crc kubenswrapper[4974]: E1013 19:27:02.977266 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f\": container with ID starting with c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f not found: ID does not exist" containerID="c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f"
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.977305 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f"} err="failed to get container status \"c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f\": rpc error: code = NotFound desc = could not find container \"c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f\": container with ID starting with c22ae0709177c6f806dbe042bc45802ac519d168214e44b6611ab3323c8c272f not found: ID does not exist"
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.977333 4974 scope.go:117] "RemoveContainer" containerID="1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7"
Oct 13 19:27:02 crc kubenswrapper[4974]: E1013 19:27:02.978026 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7\": container with ID starting with 1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7 not found: ID does not exist" containerID="1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7"
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.978166 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7"} err="failed to get container status \"1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7\": rpc error: code = NotFound desc = could not find container \"1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7\": container with ID starting with 1686dc417bd6602b869c5870f06d07c6d97e765a6412da176009c41455debab7 not found: ID does not exist"
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.978272 4974 scope.go:117] "RemoveContainer" containerID="be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4"
Oct 13 19:27:02 crc kubenswrapper[4974]: E1013 19:27:02.978744 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4\": container with ID starting with be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4 not found: ID does not exist" containerID="be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4"
Oct 13 19:27:02 crc kubenswrapper[4974]: I1013 19:27:02.978792 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4"} err="failed to get container status \"be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4\": rpc error: code = NotFound desc = could not find container \"be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4\": container with ID starting with be2023da40569eb5e5a2b146bdcf7215cc4583a33f516bfbcd45aae651892ae4 not found: ID does not exist"
Oct 13 19:27:03 crc kubenswrapper[4974]: I1013 19:27:03.824636 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" path="/var/lib/kubelet/pods/184ec4aa-3ec1-481f-afa4-3242355bd0a3/volumes"
Oct 13 19:27:07 crc kubenswrapper[4974]: I1013 19:27:07.743146 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 19:27:07 crc kubenswrapper[4974]: I1013 19:27:07.743233 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 19:27:07 crc kubenswrapper[4974]: I1013 19:27:07.743314 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b"
Oct 13 19:27:07 crc kubenswrapper[4974]: I1013 19:27:07.744419 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"49c73ec8d4c50439efc253b3fdcd9b11cb8e08337975c5d640f814ffccb7475c"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Oct 13 19:27:07 crc kubenswrapper[4974]: I1013 19:27:07.744527 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://49c73ec8d4c50439efc253b3fdcd9b11cb8e08337975c5d640f814ffccb7475c" gracePeriod=600
Oct 13 19:27:08 crc kubenswrapper[4974]: I1013 19:27:08.922927 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="49c73ec8d4c50439efc253b3fdcd9b11cb8e08337975c5d640f814ffccb7475c" exitCode=0
Oct 13 19:27:08 crc kubenswrapper[4974]: I1013 19:27:08.923168 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"49c73ec8d4c50439efc253b3fdcd9b11cb8e08337975c5d640f814ffccb7475c"}
Oct 13 19:27:08 crc kubenswrapper[4974]: I1013 19:27:08.923622 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d"}
Oct 13 19:27:08 crc kubenswrapper[4974]: I1013 19:27:08.923669 4974 scope.go:117] "RemoveContainer" containerID="b9884fad3166cef61c9252cfa4f3f4120717a76a181364f13de00ba9ce9e4237"
Oct 13 19:27:28 crc kubenswrapper[4974]: I1013 19:27:28.891345 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b7l76"]
Oct 13 19:27:28 crc kubenswrapper[4974]: E1013 19:27:28.892668 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerName="extract-content"
Oct 13 19:27:28 crc kubenswrapper[4974]: I1013 19:27:28.892686 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerName="extract-content"
Oct 13 19:27:28 crc kubenswrapper[4974]: E1013 19:27:28.892718 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerName="registry-server"
Oct 13 19:27:28 crc kubenswrapper[4974]: I1013 19:27:28.892726 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerName="registry-server"
Oct 13 19:27:28 crc kubenswrapper[4974]: E1013 19:27:28.892771 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerName="extract-utilities"
Oct 13 19:27:28 crc kubenswrapper[4974]: I1013 19:27:28.892779 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerName="extract-utilities"
Oct 13 19:27:28 crc kubenswrapper[4974]: I1013 19:27:28.892988 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="184ec4aa-3ec1-481f-afa4-3242355bd0a3" containerName="registry-server"
Oct 13 19:27:28 crc kubenswrapper[4974]: I1013 19:27:28.894772 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:28 crc kubenswrapper[4974]: I1013 19:27:28.914539 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7l76"]
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.041435 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-catalog-content\") pod \"redhat-operators-b7l76\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") " pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.042760 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm7s4\" (UniqueName: \"kubernetes.io/projected/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-kube-api-access-hm7s4\") pod \"redhat-operators-b7l76\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") " pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.042916 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-utilities\") pod \"redhat-operators-b7l76\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") " pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.144446 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm7s4\" (UniqueName: \"kubernetes.io/projected/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-kube-api-access-hm7s4\") pod \"redhat-operators-b7l76\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") " pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.144564 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-utilities\") pod \"redhat-operators-b7l76\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") " pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.144608 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-catalog-content\") pod \"redhat-operators-b7l76\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") " pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.145125 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-utilities\") pod \"redhat-operators-b7l76\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") " pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.145399 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-catalog-content\") pod \"redhat-operators-b7l76\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") " pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.169382 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm7s4\" (UniqueName: \"kubernetes.io/projected/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-kube-api-access-hm7s4\") pod \"redhat-operators-b7l76\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") " pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.244359 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:29 crc kubenswrapper[4974]: I1013 19:27:29.766458 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7l76"]
Oct 13 19:27:29 crc kubenswrapper[4974]: W1013 19:27:29.769177 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57850489_3c96_4a7a_bc70_9cfb1a3fb11b.slice/crio-282045ec19eec99062023d488bc89a107697094f27fb880b4a0c55b19e090746 WatchSource:0}: Error finding container 282045ec19eec99062023d488bc89a107697094f27fb880b4a0c55b19e090746: Status 404 returned error can't find the container with id 282045ec19eec99062023d488bc89a107697094f27fb880b4a0c55b19e090746
Oct 13 19:27:30 crc kubenswrapper[4974]: I1013 19:27:30.229606 4974 generic.go:334] "Generic (PLEG): container finished" podID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerID="d9df98f6de6d2602ae06b75dd217a927f43210af141c334965165c421868ff9e" exitCode=0
Oct 13 19:27:30 crc kubenswrapper[4974]: I1013 19:27:30.229714 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7l76" event={"ID":"57850489-3c96-4a7a-bc70-9cfb1a3fb11b","Type":"ContainerDied","Data":"d9df98f6de6d2602ae06b75dd217a927f43210af141c334965165c421868ff9e"}
Oct 13 19:27:30 crc kubenswrapper[4974]: I1013 19:27:30.229978 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7l76" event={"ID":"57850489-3c96-4a7a-bc70-9cfb1a3fb11b","Type":"ContainerStarted","Data":"282045ec19eec99062023d488bc89a107697094f27fb880b4a0c55b19e090746"}
Oct 13 19:27:32 crc kubenswrapper[4974]: I1013 19:27:32.251951 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7l76" event={"ID":"57850489-3c96-4a7a-bc70-9cfb1a3fb11b","Type":"ContainerStarted","Data":"bd3d8ae967064a6888c5d10f855cb9bdce9dcc7e7667b9964fdcac87c1b8f333"}
Oct 13 19:27:35 crc kubenswrapper[4974]: I1013 19:27:35.284535 4974 generic.go:334] "Generic (PLEG): container finished" podID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerID="bd3d8ae967064a6888c5d10f855cb9bdce9dcc7e7667b9964fdcac87c1b8f333" exitCode=0
Oct 13 19:27:35 crc kubenswrapper[4974]: I1013 19:27:35.284641 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7l76" event={"ID":"57850489-3c96-4a7a-bc70-9cfb1a3fb11b","Type":"ContainerDied","Data":"bd3d8ae967064a6888c5d10f855cb9bdce9dcc7e7667b9964fdcac87c1b8f333"}
Oct 13 19:27:37 crc kubenswrapper[4974]: I1013 19:27:37.313212 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7l76" event={"ID":"57850489-3c96-4a7a-bc70-9cfb1a3fb11b","Type":"ContainerStarted","Data":"2607acbc39b8ab79cbcc5a4f32aa8fd16468a6c6b9da794fc44685f17d41859e"}
Oct 13 19:27:37 crc kubenswrapper[4974]: I1013 19:27:37.359407 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b7l76" podStartSLOduration=3.834000564 podStartE2EDuration="9.35937629s" podCreationTimestamp="2025-10-13 19:27:28 +0000 UTC" firstStartedPulling="2025-10-13 19:27:30.231857254 +0000 UTC m=+4385.136223334" lastFinishedPulling="2025-10-13 19:27:35.75723294 +0000 UTC m=+4390.661599060" observedRunningTime="2025-10-13 19:27:37.342529516 +0000 UTC m=+4392.246895626" watchObservedRunningTime="2025-10-13 19:27:37.35937629 +0000 UTC m=+4392.263742410"
Oct 13 19:27:39 crc kubenswrapper[4974]: I1013 19:27:39.245937 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:39 crc kubenswrapper[4974]: I1013 19:27:39.246448 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:40 crc kubenswrapper[4974]: I1013 19:27:40.297089 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b7l76" podUID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerName="registry-server" probeResult="failure" output=<
Oct 13 19:27:40 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s
Oct 13 19:27:40 crc kubenswrapper[4974]: >
Oct 13 19:27:49 crc kubenswrapper[4974]: I1013 19:27:49.852511 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:49 crc kubenswrapper[4974]: I1013 19:27:49.918436 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:50 crc kubenswrapper[4974]: I1013 19:27:50.092340 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b7l76"]
Oct 13 19:27:51 crc kubenswrapper[4974]: I1013 19:27:51.453646 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b7l76" podUID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerName="registry-server" containerID="cri-o://2607acbc39b8ab79cbcc5a4f32aa8fd16468a6c6b9da794fc44685f17d41859e" gracePeriod=2
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.475433 4974 generic.go:334] "Generic (PLEG): container finished" podID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerID="2607acbc39b8ab79cbcc5a4f32aa8fd16468a6c6b9da794fc44685f17d41859e" exitCode=0
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.475505 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7l76" event={"ID":"57850489-3c96-4a7a-bc70-9cfb1a3fb11b","Type":"ContainerDied","Data":"2607acbc39b8ab79cbcc5a4f32aa8fd16468a6c6b9da794fc44685f17d41859e"}
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.650064 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.819911 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm7s4\" (UniqueName: \"kubernetes.io/projected/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-kube-api-access-hm7s4\") pod \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") "
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.820143 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-utilities\") pod \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") "
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.820358 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-catalog-content\") pod \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\" (UID: \"57850489-3c96-4a7a-bc70-9cfb1a3fb11b\") "
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.820965 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-utilities" (OuterVolumeSpecName: "utilities") pod "57850489-3c96-4a7a-bc70-9cfb1a3fb11b" (UID: "57850489-3c96-4a7a-bc70-9cfb1a3fb11b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.821135 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-utilities\") on node \"crc\" DevicePath \"\""
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.829375 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-kube-api-access-hm7s4" (OuterVolumeSpecName: "kube-api-access-hm7s4") pod "57850489-3c96-4a7a-bc70-9cfb1a3fb11b" (UID: "57850489-3c96-4a7a-bc70-9cfb1a3fb11b"). InnerVolumeSpecName "kube-api-access-hm7s4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.922977 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm7s4\" (UniqueName: \"kubernetes.io/projected/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-kube-api-access-hm7s4\") on node \"crc\" DevicePath \"\""
Oct 13 19:27:52 crc kubenswrapper[4974]: I1013 19:27:52.936862 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57850489-3c96-4a7a-bc70-9cfb1a3fb11b" (UID: "57850489-3c96-4a7a-bc70-9cfb1a3fb11b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 19:27:53 crc kubenswrapper[4974]: I1013 19:27:53.025068 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57850489-3c96-4a7a-bc70-9cfb1a3fb11b-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 13 19:27:53 crc kubenswrapper[4974]: I1013 19:27:53.493980 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7l76" event={"ID":"57850489-3c96-4a7a-bc70-9cfb1a3fb11b","Type":"ContainerDied","Data":"282045ec19eec99062023d488bc89a107697094f27fb880b4a0c55b19e090746"}
Oct 13 19:27:53 crc kubenswrapper[4974]: I1013 19:27:53.494059 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7l76"
Oct 13 19:27:53 crc kubenswrapper[4974]: I1013 19:27:53.494492 4974 scope.go:117] "RemoveContainer" containerID="2607acbc39b8ab79cbcc5a4f32aa8fd16468a6c6b9da794fc44685f17d41859e"
Oct 13 19:27:53 crc kubenswrapper[4974]: I1013 19:27:53.525053 4974 scope.go:117] "RemoveContainer" containerID="bd3d8ae967064a6888c5d10f855cb9bdce9dcc7e7667b9964fdcac87c1b8f333"
Oct 13 19:27:53 crc kubenswrapper[4974]: I1013 19:27:53.542864 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b7l76"]
Oct 13 19:27:53 crc kubenswrapper[4974]: I1013 19:27:53.559459 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b7l76"]
Oct 13 19:27:53 crc kubenswrapper[4974]: I1013 19:27:53.583565 4974 scope.go:117] "RemoveContainer" containerID="d9df98f6de6d2602ae06b75dd217a927f43210af141c334965165c421868ff9e"
Oct 13 19:27:53 crc kubenswrapper[4974]: I1013 19:27:53.835065 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" path="/var/lib/kubelet/pods/57850489-3c96-4a7a-bc70-9cfb1a3fb11b/volumes"
Oct 13 19:29:37 crc kubenswrapper[4974]: I1013 19:29:37.742597 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Oct 13 19:29:37 crc kubenswrapper[4974]: I1013 19:29:37.743167 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.144709 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"]
Oct 13 19:30:00 crc kubenswrapper[4974]: E1013 19:30:00.145487 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerName="extract-utilities"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.145501 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerName="extract-utilities"
Oct 13 19:30:00 crc kubenswrapper[4974]: E1013 19:30:00.145511 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerName="extract-content"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.145517 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerName="extract-content"
Oct 13 19:30:00 crc kubenswrapper[4974]: E1013 19:30:00.145554 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerName="registry-server"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.145560 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerName="registry-server"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.145801 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="57850489-3c96-4a7a-bc70-9cfb1a3fb11b" containerName="registry-server"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.146453 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.149330 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.149625 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.166684 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"]
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.306468 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4phs\" (UniqueName: \"kubernetes.io/projected/df772f2b-f688-47d9-91e3-103cf7afaecf-kube-api-access-v4phs\") pod \"collect-profiles-29339730-wnwsx\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.306557 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df772f2b-f688-47d9-91e3-103cf7afaecf-secret-volume\") pod \"collect-profiles-29339730-wnwsx\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.306596 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df772f2b-f688-47d9-91e3-103cf7afaecf-config-volume\") pod \"collect-profiles-29339730-wnwsx\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.409265 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df772f2b-f688-47d9-91e3-103cf7afaecf-config-volume\") pod \"collect-profiles-29339730-wnwsx\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.409514 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4phs\" (UniqueName: \"kubernetes.io/projected/df772f2b-f688-47d9-91e3-103cf7afaecf-kube-api-access-v4phs\") pod \"collect-profiles-29339730-wnwsx\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.409592 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df772f2b-f688-47d9-91e3-103cf7afaecf-secret-volume\") pod \"collect-profiles-29339730-wnwsx\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"
Oct 13 19:30:00 crc kubenswrapper[4974]: I1013 19:30:00.410932 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df772f2b-f688-47d9-91e3-103cf7afaecf-config-volume\") pod \"collect-profiles-29339730-wnwsx\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"
Oct 13 19:30:01 crc kubenswrapper[4974]: I1013 19:30:01.000840 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df772f2b-f688-47d9-91e3-103cf7afaecf-secret-volume\") pod \"collect-profiles-29339730-wnwsx\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"
Oct 13 19:30:01 crc kubenswrapper[4974]: I1013 19:30:01.002356 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4phs\" (UniqueName: \"kubernetes.io/projected/df772f2b-f688-47d9-91e3-103cf7afaecf-kube-api-access-v4phs\") pod \"collect-profiles-29339730-wnwsx\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"
Oct 13 19:30:01 crc kubenswrapper[4974]: I1013 19:30:01.119032 4974 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx" Oct 13 19:30:01 crc kubenswrapper[4974]: I1013 19:30:01.673826 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx"] Oct 13 19:30:02 crc kubenswrapper[4974]: I1013 19:30:02.046398 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx" event={"ID":"df772f2b-f688-47d9-91e3-103cf7afaecf","Type":"ContainerStarted","Data":"b620df50b4e6082e963c4edfc9127f534c8fb529f81d86477f5bcb3d1b4300ce"} Oct 13 19:30:02 crc kubenswrapper[4974]: I1013 19:30:02.046767 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx" event={"ID":"df772f2b-f688-47d9-91e3-103cf7afaecf","Type":"ContainerStarted","Data":"0b39814af4e9439665814b631ad6f39e4dc96c8a0c1c27c977ca78471381f39d"} Oct 13 19:30:02 crc kubenswrapper[4974]: I1013 19:30:02.073415 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx" podStartSLOduration=2.073399343 podStartE2EDuration="2.073399343s" podCreationTimestamp="2025-10-13 19:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 19:30:02.06549411 +0000 UTC m=+4536.969860190" watchObservedRunningTime="2025-10-13 19:30:02.073399343 +0000 UTC m=+4536.977765423" Oct 13 19:30:03 crc kubenswrapper[4974]: I1013 19:30:03.059746 4974 generic.go:334] "Generic (PLEG): container finished" podID="df772f2b-f688-47d9-91e3-103cf7afaecf" containerID="b620df50b4e6082e963c4edfc9127f534c8fb529f81d86477f5bcb3d1b4300ce" exitCode=0 Oct 13 19:30:03 crc kubenswrapper[4974]: I1013 19:30:03.059797 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx" event={"ID":"df772f2b-f688-47d9-91e3-103cf7afaecf","Type":"ContainerDied","Data":"b620df50b4e6082e963c4edfc9127f534c8fb529f81d86477f5bcb3d1b4300ce"} Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.461151 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx" Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.499395 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df772f2b-f688-47d9-91e3-103cf7afaecf-config-volume\") pod \"df772f2b-f688-47d9-91e3-103cf7afaecf\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.499563 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df772f2b-f688-47d9-91e3-103cf7afaecf-secret-volume\") pod \"df772f2b-f688-47d9-91e3-103cf7afaecf\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.499755 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4phs\" (UniqueName: \"kubernetes.io/projected/df772f2b-f688-47d9-91e3-103cf7afaecf-kube-api-access-v4phs\") pod \"df772f2b-f688-47d9-91e3-103cf7afaecf\" (UID: \"df772f2b-f688-47d9-91e3-103cf7afaecf\") " Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.500121 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df772f2b-f688-47d9-91e3-103cf7afaecf-config-volume" (OuterVolumeSpecName: "config-volume") pod "df772f2b-f688-47d9-91e3-103cf7afaecf" (UID: "df772f2b-f688-47d9-91e3-103cf7afaecf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.500364 4974 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df772f2b-f688-47d9-91e3-103cf7afaecf-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.598721 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df772f2b-f688-47d9-91e3-103cf7afaecf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "df772f2b-f688-47d9-91e3-103cf7afaecf" (UID: "df772f2b-f688-47d9-91e3-103cf7afaecf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.598886 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df772f2b-f688-47d9-91e3-103cf7afaecf-kube-api-access-v4phs" (OuterVolumeSpecName: "kube-api-access-v4phs") pod "df772f2b-f688-47d9-91e3-103cf7afaecf" (UID: "df772f2b-f688-47d9-91e3-103cf7afaecf"). InnerVolumeSpecName "kube-api-access-v4phs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.602217 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4phs\" (UniqueName: \"kubernetes.io/projected/df772f2b-f688-47d9-91e3-103cf7afaecf-kube-api-access-v4phs\") on node \"crc\" DevicePath \"\"" Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.602255 4974 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df772f2b-f688-47d9-91e3-103cf7afaecf-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.737219 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8"] Oct 13 19:30:04 crc kubenswrapper[4974]: I1013 19:30:04.745739 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339685-dqhp8"] Oct 13 19:30:05 crc kubenswrapper[4974]: I1013 19:30:05.088728 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx" event={"ID":"df772f2b-f688-47d9-91e3-103cf7afaecf","Type":"ContainerDied","Data":"0b39814af4e9439665814b631ad6f39e4dc96c8a0c1c27c977ca78471381f39d"} Oct 13 19:30:05 crc kubenswrapper[4974]: I1013 19:30:05.088772 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b39814af4e9439665814b631ad6f39e4dc96c8a0c1c27c977ca78471381f39d" Oct 13 19:30:05 crc kubenswrapper[4974]: I1013 19:30:05.089040 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339730-wnwsx" Oct 13 19:30:05 crc kubenswrapper[4974]: I1013 19:30:05.866722 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd" path="/var/lib/kubelet/pods/bcbab4dc-a478-4f7a-b6f9-60fd81fcbabd/volumes" Oct 13 19:30:07 crc kubenswrapper[4974]: I1013 19:30:07.743272 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:30:07 crc kubenswrapper[4974]: I1013 19:30:07.743901 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:30:37 crc kubenswrapper[4974]: I1013 19:30:37.743390 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:30:37 crc kubenswrapper[4974]: I1013 19:30:37.744000 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:30:37 crc kubenswrapper[4974]: I1013 19:30:37.744053 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 19:30:37 crc kubenswrapper[4974]: I1013 19:30:37.744772 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 19:30:37 crc kubenswrapper[4974]: I1013 19:30:37.744838 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" gracePeriod=600 Oct 13 19:30:37 crc kubenswrapper[4974]: E1013 19:30:37.877855 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:30:38 crc kubenswrapper[4974]: I1013 19:30:38.481055 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" exitCode=0 Oct 13 19:30:38 crc kubenswrapper[4974]: I1013 19:30:38.481126 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d"} Oct 13 19:30:38 crc 
kubenswrapper[4974]: I1013 19:30:38.481374 4974 scope.go:117] "RemoveContainer" containerID="49c73ec8d4c50439efc253b3fdcd9b11cb8e08337975c5d640f814ffccb7475c" Oct 13 19:30:38 crc kubenswrapper[4974]: I1013 19:30:38.482055 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:30:38 crc kubenswrapper[4974]: E1013 19:30:38.482286 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:30:49 crc kubenswrapper[4974]: I1013 19:30:49.816903 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:30:49 crc kubenswrapper[4974]: E1013 19:30:49.817642 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:30:51 crc kubenswrapper[4974]: I1013 19:30:51.299389 4974 scope.go:117] "RemoveContainer" containerID="0054e29819a35a39b42907db5712d1dddb819715f8daaceabf91bbd4f5a2e07c" Oct 13 19:31:01 crc kubenswrapper[4974]: I1013 19:31:01.812400 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:31:01 crc kubenswrapper[4974]: E1013 19:31:01.813364 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:31:13 crc kubenswrapper[4974]: I1013 19:31:13.811952 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:31:13 crc kubenswrapper[4974]: E1013 19:31:13.812938 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:31:24 crc kubenswrapper[4974]: I1013 19:31:24.812145 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:31:24 crc kubenswrapper[4974]: E1013 19:31:24.813585 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:31:35 crc kubenswrapper[4974]: I1013 19:31:35.824777 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:31:35 crc kubenswrapper[4974]: E1013 19:31:35.826027 4974 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.459401 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rcj9z"] Oct 13 19:31:41 crc kubenswrapper[4974]: E1013 19:31:41.460757 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df772f2b-f688-47d9-91e3-103cf7afaecf" containerName="collect-profiles" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.460778 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="df772f2b-f688-47d9-91e3-103cf7afaecf" containerName="collect-profiles" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.469934 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="df772f2b-f688-47d9-91e3-103cf7afaecf" containerName="collect-profiles" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.518161 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.518051 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcj9z"] Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.626367 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-utilities\") pod \"certified-operators-rcj9z\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.626952 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-catalog-content\") pod \"certified-operators-rcj9z\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.627149 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnpwb\" (UniqueName: \"kubernetes.io/projected/09769371-2036-4973-a736-f9c02878ded1-kube-api-access-nnpwb\") pod \"certified-operators-rcj9z\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.729157 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-utilities\") pod \"certified-operators-rcj9z\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.729333 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-catalog-content\") pod \"certified-operators-rcj9z\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.729406 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnpwb\" (UniqueName: \"kubernetes.io/projected/09769371-2036-4973-a736-f9c02878ded1-kube-api-access-nnpwb\") pod \"certified-operators-rcj9z\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.729894 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-catalog-content\") pod \"certified-operators-rcj9z\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.730049 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-utilities\") pod \"certified-operators-rcj9z\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.761607 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnpwb\" (UniqueName: \"kubernetes.io/projected/09769371-2036-4973-a736-f9c02878ded1-kube-api-access-nnpwb\") pod \"certified-operators-rcj9z\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:41 crc kubenswrapper[4974]: I1013 19:31:41.853914 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:42 crc kubenswrapper[4974]: I1013 19:31:42.428885 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rcj9z"] Oct 13 19:31:43 crc kubenswrapper[4974]: I1013 19:31:43.264834 4974 generic.go:334] "Generic (PLEG): container finished" podID="09769371-2036-4973-a736-f9c02878ded1" containerID="fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6" exitCode=0 Oct 13 19:31:43 crc kubenswrapper[4974]: I1013 19:31:43.264912 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcj9z" event={"ID":"09769371-2036-4973-a736-f9c02878ded1","Type":"ContainerDied","Data":"fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6"} Oct 13 19:31:43 crc kubenswrapper[4974]: I1013 19:31:43.265425 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcj9z" event={"ID":"09769371-2036-4973-a736-f9c02878ded1","Type":"ContainerStarted","Data":"ee1911984292673777669011feeae502ea00fa26ff0b6b831694e43bed5aabe5"} Oct 13 19:31:44 crc kubenswrapper[4974]: I1013 19:31:44.278210 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcj9z" event={"ID":"09769371-2036-4973-a736-f9c02878ded1","Type":"ContainerStarted","Data":"3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0"} Oct 13 19:31:45 crc kubenswrapper[4974]: I1013 19:31:45.302242 4974 generic.go:334] "Generic (PLEG): container finished" podID="09769371-2036-4973-a736-f9c02878ded1" containerID="3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0" exitCode=0 Oct 13 19:31:45 crc kubenswrapper[4974]: I1013 19:31:45.302388 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcj9z" 
event={"ID":"09769371-2036-4973-a736-f9c02878ded1","Type":"ContainerDied","Data":"3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0"} Oct 13 19:31:46 crc kubenswrapper[4974]: I1013 19:31:46.312407 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcj9z" event={"ID":"09769371-2036-4973-a736-f9c02878ded1","Type":"ContainerStarted","Data":"befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663"} Oct 13 19:31:46 crc kubenswrapper[4974]: I1013 19:31:46.333350 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rcj9z" podStartSLOduration=2.8622764480000003 podStartE2EDuration="5.333335314s" podCreationTimestamp="2025-10-13 19:31:41 +0000 UTC" firstStartedPulling="2025-10-13 19:31:43.267930823 +0000 UTC m=+4638.172296913" lastFinishedPulling="2025-10-13 19:31:45.738989699 +0000 UTC m=+4640.643355779" observedRunningTime="2025-10-13 19:31:46.332670425 +0000 UTC m=+4641.237036505" watchObservedRunningTime="2025-10-13 19:31:46.333335314 +0000 UTC m=+4641.237701384" Oct 13 19:31:49 crc kubenswrapper[4974]: I1013 19:31:49.812346 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:31:49 crc kubenswrapper[4974]: E1013 19:31:49.813171 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:31:51 crc kubenswrapper[4974]: I1013 19:31:51.855127 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:51 crc 
kubenswrapper[4974]: I1013 19:31:51.855685 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:51 crc kubenswrapper[4974]: I1013 19:31:51.906986 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:52 crc kubenswrapper[4974]: I1013 19:31:52.453423 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:52 crc kubenswrapper[4974]: I1013 19:31:52.509614 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcj9z"] Oct 13 19:31:54 crc kubenswrapper[4974]: I1013 19:31:54.413834 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rcj9z" podUID="09769371-2036-4973-a736-f9c02878ded1" containerName="registry-server" containerID="cri-o://befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663" gracePeriod=2 Oct 13 19:31:54 crc kubenswrapper[4974]: I1013 19:31:54.992023 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.073247 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-catalog-content\") pod \"09769371-2036-4973-a736-f9c02878ded1\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.073378 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-utilities\") pod \"09769371-2036-4973-a736-f9c02878ded1\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.073506 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nnpwb\" (UniqueName: \"kubernetes.io/projected/09769371-2036-4973-a736-f9c02878ded1-kube-api-access-nnpwb\") pod \"09769371-2036-4973-a736-f9c02878ded1\" (UID: \"09769371-2036-4973-a736-f9c02878ded1\") " Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.075239 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-utilities" (OuterVolumeSpecName: "utilities") pod "09769371-2036-4973-a736-f9c02878ded1" (UID: "09769371-2036-4973-a736-f9c02878ded1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.079157 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09769371-2036-4973-a736-f9c02878ded1-kube-api-access-nnpwb" (OuterVolumeSpecName: "kube-api-access-nnpwb") pod "09769371-2036-4973-a736-f9c02878ded1" (UID: "09769371-2036-4973-a736-f9c02878ded1"). InnerVolumeSpecName "kube-api-access-nnpwb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.128060 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09769371-2036-4973-a736-f9c02878ded1" (UID: "09769371-2036-4973-a736-f9c02878ded1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.175854 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nnpwb\" (UniqueName: \"kubernetes.io/projected/09769371-2036-4973-a736-f9c02878ded1-kube-api-access-nnpwb\") on node \"crc\" DevicePath \"\"" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.175890 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.175902 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09769371-2036-4973-a736-f9c02878ded1-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.442038 4974 generic.go:334] "Generic (PLEG): container finished" podID="09769371-2036-4973-a736-f9c02878ded1" containerID="befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663" exitCode=0 Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.442088 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rcj9z" event={"ID":"09769371-2036-4973-a736-f9c02878ded1","Type":"ContainerDied","Data":"befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663"} Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.442154 4974 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-rcj9z" event={"ID":"09769371-2036-4973-a736-f9c02878ded1","Type":"ContainerDied","Data":"ee1911984292673777669011feeae502ea00fa26ff0b6b831694e43bed5aabe5"} Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.442174 4974 scope.go:117] "RemoveContainer" containerID="befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.442182 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rcj9z" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.499438 4974 scope.go:117] "RemoveContainer" containerID="3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.502944 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rcj9z"] Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.513908 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rcj9z"] Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.536246 4974 scope.go:117] "RemoveContainer" containerID="fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.629868 4974 scope.go:117] "RemoveContainer" containerID="befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663" Oct 13 19:31:55 crc kubenswrapper[4974]: E1013 19:31:55.631322 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663\": container with ID starting with befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663 not found: ID does not exist" containerID="befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 
19:31:55.631393 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663"} err="failed to get container status \"befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663\": rpc error: code = NotFound desc = could not find container \"befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663\": container with ID starting with befddcd17b276849080f59c6ef42a18382853ec794524848552fdac8d1b21663 not found: ID does not exist" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.631438 4974 scope.go:117] "RemoveContainer" containerID="3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0" Oct 13 19:31:55 crc kubenswrapper[4974]: E1013 19:31:55.631896 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0\": container with ID starting with 3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0 not found: ID does not exist" containerID="3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.631943 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0"} err="failed to get container status \"3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0\": rpc error: code = NotFound desc = could not find container \"3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0\": container with ID starting with 3dd62c504ea645f5cda9ea06b74ea205720ee13c800ecefed6121454af06a4d0 not found: ID does not exist" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.631980 4974 scope.go:117] "RemoveContainer" containerID="fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6" Oct 13 19:31:55 crc 
kubenswrapper[4974]: E1013 19:31:55.632378 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6\": container with ID starting with fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6 not found: ID does not exist" containerID="fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.632422 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6"} err="failed to get container status \"fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6\": rpc error: code = NotFound desc = could not find container \"fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6\": container with ID starting with fba61e30d4cf25d83272bac98078906870de488c4824115dc6f3e2b734174dc6 not found: ID does not exist" Oct 13 19:31:55 crc kubenswrapper[4974]: I1013 19:31:55.837095 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09769371-2036-4973-a736-f9c02878ded1" path="/var/lib/kubelet/pods/09769371-2036-4973-a736-f9c02878ded1/volumes" Oct 13 19:32:03 crc kubenswrapper[4974]: I1013 19:32:03.813603 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:32:03 crc kubenswrapper[4974]: E1013 19:32:03.817100 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:32:18 crc 
kubenswrapper[4974]: I1013 19:32:18.811445 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:32:18 crc kubenswrapper[4974]: E1013 19:32:18.812209 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:32:33 crc kubenswrapper[4974]: I1013 19:32:33.811862 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:32:33 crc kubenswrapper[4974]: E1013 19:32:33.812868 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:32:48 crc kubenswrapper[4974]: I1013 19:32:48.812065 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:32:48 crc kubenswrapper[4974]: E1013 19:32:48.812880 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 
13 19:33:03 crc kubenswrapper[4974]: I1013 19:33:03.812054 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:33:03 crc kubenswrapper[4974]: E1013 19:33:03.812910 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:33:15 crc kubenswrapper[4974]: I1013 19:33:15.820988 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:33:15 crc kubenswrapper[4974]: E1013 19:33:15.822070 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:33:29 crc kubenswrapper[4974]: I1013 19:33:29.812220 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:33:29 crc kubenswrapper[4974]: E1013 19:33:29.813274 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" 
podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:33:42 crc kubenswrapper[4974]: I1013 19:33:42.814159 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:33:42 crc kubenswrapper[4974]: E1013 19:33:42.815529 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:33:55 crc kubenswrapper[4974]: I1013 19:33:55.828937 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:33:55 crc kubenswrapper[4974]: E1013 19:33:55.830087 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:34:09 crc kubenswrapper[4974]: I1013 19:34:09.811523 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:34:09 crc kubenswrapper[4974]: E1013 19:34:09.812617 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:34:24 crc kubenswrapper[4974]: I1013 19:34:24.811362 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:34:24 crc kubenswrapper[4974]: E1013 19:34:24.812223 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:34:38 crc kubenswrapper[4974]: I1013 19:34:38.811820 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:34:38 crc kubenswrapper[4974]: E1013 19:34:38.812823 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:34:49 crc kubenswrapper[4974]: I1013 19:34:49.811622 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:34:49 crc kubenswrapper[4974]: E1013 19:34:49.812481 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:35:04 crc kubenswrapper[4974]: I1013 19:35:04.811441 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:35:04 crc kubenswrapper[4974]: E1013 19:35:04.812116 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:35:19 crc kubenswrapper[4974]: I1013 19:35:19.812545 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:35:19 crc kubenswrapper[4974]: E1013 19:35:19.813287 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:35:24 crc kubenswrapper[4974]: E1013 19:35:24.831168 4974 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.30:36620->38.102.83.30:44205: write tcp 38.102.83.30:36620->38.102.83.30:44205: write: broken pipe Oct 13 19:35:26 crc kubenswrapper[4974]: E1013 19:35:26.810485 4974 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 
38.102.83.30:36666->38.102.83.30:44205: write tcp 38.102.83.30:36666->38.102.83.30:44205: write: broken pipe Oct 13 19:35:30 crc kubenswrapper[4974]: I1013 19:35:30.812296 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:35:30 crc kubenswrapper[4974]: E1013 19:35:30.813371 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:35:42 crc kubenswrapper[4974]: I1013 19:35:42.811971 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:35:43 crc kubenswrapper[4974]: I1013 19:35:43.164809 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"e5cb269b79af5d7941f13fd37362de02a8097d81ebc145023dbe7d25bbf89e19"} Oct 13 19:36:23 crc kubenswrapper[4974]: I1013 19:36:23.921797 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8qkfq"] Oct 13 19:36:23 crc kubenswrapper[4974]: E1013 19:36:23.923029 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09769371-2036-4973-a736-f9c02878ded1" containerName="extract-utilities" Oct 13 19:36:23 crc kubenswrapper[4974]: I1013 19:36:23.923049 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="09769371-2036-4973-a736-f9c02878ded1" containerName="extract-utilities" Oct 13 19:36:23 crc kubenswrapper[4974]: E1013 19:36:23.923077 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="09769371-2036-4973-a736-f9c02878ded1" containerName="extract-content" Oct 13 19:36:23 crc kubenswrapper[4974]: I1013 19:36:23.923085 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="09769371-2036-4973-a736-f9c02878ded1" containerName="extract-content" Oct 13 19:36:23 crc kubenswrapper[4974]: E1013 19:36:23.923097 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09769371-2036-4973-a736-f9c02878ded1" containerName="registry-server" Oct 13 19:36:23 crc kubenswrapper[4974]: I1013 19:36:23.923105 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="09769371-2036-4973-a736-f9c02878ded1" containerName="registry-server" Oct 13 19:36:23 crc kubenswrapper[4974]: I1013 19:36:23.923343 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="09769371-2036-4973-a736-f9c02878ded1" containerName="registry-server" Oct 13 19:36:23 crc kubenswrapper[4974]: I1013 19:36:23.925207 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:23 crc kubenswrapper[4974]: I1013 19:36:23.936534 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qkfq"] Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.015505 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-utilities\") pod \"community-operators-8qkfq\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.015593 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-catalog-content\") pod \"community-operators-8qkfq\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") 
" pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.015632 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7w5s8\" (UniqueName: \"kubernetes.io/projected/96a11fdd-424a-415b-acc7-ca475f1482d6-kube-api-access-7w5s8\") pod \"community-operators-8qkfq\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.117374 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-utilities\") pod \"community-operators-8qkfq\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.117463 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-catalog-content\") pod \"community-operators-8qkfq\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.117502 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7w5s8\" (UniqueName: \"kubernetes.io/projected/96a11fdd-424a-415b-acc7-ca475f1482d6-kube-api-access-7w5s8\") pod \"community-operators-8qkfq\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.117917 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-utilities\") pod \"community-operators-8qkfq\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " 
pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.118066 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-catalog-content\") pod \"community-operators-8qkfq\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.137795 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7w5s8\" (UniqueName: \"kubernetes.io/projected/96a11fdd-424a-415b-acc7-ca475f1482d6-kube-api-access-7w5s8\") pod \"community-operators-8qkfq\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.256090 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:24 crc kubenswrapper[4974]: I1013 19:36:24.815464 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8qkfq"] Oct 13 19:36:25 crc kubenswrapper[4974]: I1013 19:36:25.677111 4974 generic.go:334] "Generic (PLEG): container finished" podID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerID="ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c" exitCode=0 Oct 13 19:36:25 crc kubenswrapper[4974]: I1013 19:36:25.677196 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qkfq" event={"ID":"96a11fdd-424a-415b-acc7-ca475f1482d6","Type":"ContainerDied","Data":"ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c"} Oct 13 19:36:25 crc kubenswrapper[4974]: I1013 19:36:25.677518 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qkfq" 
event={"ID":"96a11fdd-424a-415b-acc7-ca475f1482d6","Type":"ContainerStarted","Data":"1cb8837388362bc6f87fac8b4b824ae7c9e6695408ca638beec0d8a7d996d239"} Oct 13 19:36:25 crc kubenswrapper[4974]: I1013 19:36:25.682352 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 19:36:26 crc kubenswrapper[4974]: I1013 19:36:26.690391 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qkfq" event={"ID":"96a11fdd-424a-415b-acc7-ca475f1482d6","Type":"ContainerStarted","Data":"e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18"} Oct 13 19:36:28 crc kubenswrapper[4974]: I1013 19:36:28.726092 4974 generic.go:334] "Generic (PLEG): container finished" podID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerID="e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18" exitCode=0 Oct 13 19:36:28 crc kubenswrapper[4974]: I1013 19:36:28.726151 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qkfq" event={"ID":"96a11fdd-424a-415b-acc7-ca475f1482d6","Type":"ContainerDied","Data":"e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18"} Oct 13 19:36:29 crc kubenswrapper[4974]: I1013 19:36:29.742165 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qkfq" event={"ID":"96a11fdd-424a-415b-acc7-ca475f1482d6","Type":"ContainerStarted","Data":"be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29"} Oct 13 19:36:29 crc kubenswrapper[4974]: I1013 19:36:29.768902 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8qkfq" podStartSLOduration=3.181283497 podStartE2EDuration="6.768875729s" podCreationTimestamp="2025-10-13 19:36:23 +0000 UTC" firstStartedPulling="2025-10-13 19:36:25.681815727 +0000 UTC m=+4920.586181847" lastFinishedPulling="2025-10-13 19:36:29.269407979 +0000 UTC 
m=+4924.173774079" observedRunningTime="2025-10-13 19:36:29.76074138 +0000 UTC m=+4924.665107500" watchObservedRunningTime="2025-10-13 19:36:29.768875729 +0000 UTC m=+4924.673241819" Oct 13 19:36:34 crc kubenswrapper[4974]: I1013 19:36:34.256638 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:34 crc kubenswrapper[4974]: I1013 19:36:34.258553 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:35 crc kubenswrapper[4974]: I1013 19:36:35.340739 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-8qkfq" podUID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerName="registry-server" probeResult="failure" output=< Oct 13 19:36:35 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 19:36:35 crc kubenswrapper[4974]: > Oct 13 19:36:44 crc kubenswrapper[4974]: I1013 19:36:44.331809 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:44 crc kubenswrapper[4974]: I1013 19:36:44.397638 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:44 crc kubenswrapper[4974]: I1013 19:36:44.577027 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qkfq"] Oct 13 19:36:45 crc kubenswrapper[4974]: I1013 19:36:45.930939 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8qkfq" podUID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerName="registry-server" containerID="cri-o://be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29" gracePeriod=2 Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.527567 4974 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.684451 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-catalog-content\") pod \"96a11fdd-424a-415b-acc7-ca475f1482d6\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.684640 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7w5s8\" (UniqueName: \"kubernetes.io/projected/96a11fdd-424a-415b-acc7-ca475f1482d6-kube-api-access-7w5s8\") pod \"96a11fdd-424a-415b-acc7-ca475f1482d6\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.684789 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-utilities\") pod \"96a11fdd-424a-415b-acc7-ca475f1482d6\" (UID: \"96a11fdd-424a-415b-acc7-ca475f1482d6\") " Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.686411 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-utilities" (OuterVolumeSpecName: "utilities") pod "96a11fdd-424a-415b-acc7-ca475f1482d6" (UID: "96a11fdd-424a-415b-acc7-ca475f1482d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.696940 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96a11fdd-424a-415b-acc7-ca475f1482d6-kube-api-access-7w5s8" (OuterVolumeSpecName: "kube-api-access-7w5s8") pod "96a11fdd-424a-415b-acc7-ca475f1482d6" (UID: "96a11fdd-424a-415b-acc7-ca475f1482d6"). 
InnerVolumeSpecName "kube-api-access-7w5s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.789531 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7w5s8\" (UniqueName: \"kubernetes.io/projected/96a11fdd-424a-415b-acc7-ca475f1482d6-kube-api-access-7w5s8\") on node \"crc\" DevicePath \"\"" Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.789582 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.797065 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96a11fdd-424a-415b-acc7-ca475f1482d6" (UID: "96a11fdd-424a-415b-acc7-ca475f1482d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.893109 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96a11fdd-424a-415b-acc7-ca475f1482d6-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.947250 4974 generic.go:334] "Generic (PLEG): container finished" podID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerID="be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29" exitCode=0 Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.947291 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qkfq" event={"ID":"96a11fdd-424a-415b-acc7-ca475f1482d6","Type":"ContainerDied","Data":"be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29"} Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.947320 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8qkfq" event={"ID":"96a11fdd-424a-415b-acc7-ca475f1482d6","Type":"ContainerDied","Data":"1cb8837388362bc6f87fac8b4b824ae7c9e6695408ca638beec0d8a7d996d239"} Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.947340 4974 scope.go:117] "RemoveContainer" containerID="be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29" Oct 13 19:36:46 crc kubenswrapper[4974]: I1013 19:36:46.947469 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8qkfq" Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.007533 4974 scope.go:117] "RemoveContainer" containerID="e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18" Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.008528 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8qkfq"] Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.034483 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8qkfq"] Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.048485 4974 scope.go:117] "RemoveContainer" containerID="ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c" Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.103334 4974 scope.go:117] "RemoveContainer" containerID="be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29" Oct 13 19:36:47 crc kubenswrapper[4974]: E1013 19:36:47.103965 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29\": container with ID starting with be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29 not found: ID does not exist" containerID="be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29" Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.104025 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29"} err="failed to get container status \"be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29\": rpc error: code = NotFound desc = could not find container \"be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29\": container with ID starting with be3df8bef85b93fa26b1fed77144f59669fc528052b694f4b52fcb112b847b29 not 
found: ID does not exist" Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.104063 4974 scope.go:117] "RemoveContainer" containerID="e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18" Oct 13 19:36:47 crc kubenswrapper[4974]: E1013 19:36:47.104602 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18\": container with ID starting with e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18 not found: ID does not exist" containerID="e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18" Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.104631 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18"} err="failed to get container status \"e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18\": rpc error: code = NotFound desc = could not find container \"e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18\": container with ID starting with e0ec76de9618e1e2c25cb4431b9c7a085ffd1836e1186246143bbf368c812e18 not found: ID does not exist" Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.104663 4974 scope.go:117] "RemoveContainer" containerID="ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c" Oct 13 19:36:47 crc kubenswrapper[4974]: E1013 19:36:47.105126 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c\": container with ID starting with ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c not found: ID does not exist" containerID="ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c" Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.105226 4974 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c"} err="failed to get container status \"ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c\": rpc error: code = NotFound desc = could not find container \"ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c\": container with ID starting with ab6f3d8ee41b9022a0836df778644ae6f89b401bdad347ce5b7894e784b7e01c not found: ID does not exist" Oct 13 19:36:47 crc kubenswrapper[4974]: I1013 19:36:47.852096 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96a11fdd-424a-415b-acc7-ca475f1482d6" path="/var/lib/kubelet/pods/96a11fdd-424a-415b-acc7-ca475f1482d6/volumes" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.409916 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c2drh"] Oct 13 19:37:47 crc kubenswrapper[4974]: E1013 19:37:47.411310 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerName="extract-content" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.411332 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerName="extract-content" Oct 13 19:37:47 crc kubenswrapper[4974]: E1013 19:37:47.411354 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerName="extract-utilities" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.411365 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerName="extract-utilities" Oct 13 19:37:47 crc kubenswrapper[4974]: E1013 19:37:47.411411 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerName="registry-server" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 
19:37:47.411422 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerName="registry-server" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.411773 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="96a11fdd-424a-415b-acc7-ca475f1482d6" containerName="registry-server" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.419182 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.455896 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c2drh"] Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.492582 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-utilities\") pod \"redhat-marketplace-c2drh\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.492687 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr29h\" (UniqueName: \"kubernetes.io/projected/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-kube-api-access-rr29h\") pod \"redhat-marketplace-c2drh\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.493303 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-catalog-content\") pod \"redhat-marketplace-c2drh\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:47 crc kubenswrapper[4974]: 
I1013 19:37:47.595941 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-catalog-content\") pod \"redhat-marketplace-c2drh\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.596053 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-utilities\") pod \"redhat-marketplace-c2drh\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.596147 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr29h\" (UniqueName: \"kubernetes.io/projected/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-kube-api-access-rr29h\") pod \"redhat-marketplace-c2drh\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.596415 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-catalog-content\") pod \"redhat-marketplace-c2drh\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.596791 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-utilities\") pod \"redhat-marketplace-c2drh\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.619313 4974 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr29h\" (UniqueName: \"kubernetes.io/projected/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-kube-api-access-rr29h\") pod \"redhat-marketplace-c2drh\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:47 crc kubenswrapper[4974]: I1013 19:37:47.768863 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:48 crc kubenswrapper[4974]: I1013 19:37:48.225070 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c2drh"] Oct 13 19:37:48 crc kubenswrapper[4974]: I1013 19:37:48.742763 4974 generic.go:334] "Generic (PLEG): container finished" podID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerID="346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13" exitCode=0 Oct 13 19:37:48 crc kubenswrapper[4974]: I1013 19:37:48.742816 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2drh" event={"ID":"cd47e9fc-22b4-44af-b2f0-cf7672b73c03","Type":"ContainerDied","Data":"346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13"} Oct 13 19:37:48 crc kubenswrapper[4974]: I1013 19:37:48.742841 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2drh" event={"ID":"cd47e9fc-22b4-44af-b2f0-cf7672b73c03","Type":"ContainerStarted","Data":"4adc524156696f1898dbe768d1430bc5f8a2ec7ce17525428e100c9cc18429ba"} Oct 13 19:37:49 crc kubenswrapper[4974]: I1013 19:37:49.761335 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2drh" event={"ID":"cd47e9fc-22b4-44af-b2f0-cf7672b73c03","Type":"ContainerStarted","Data":"5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5"} Oct 13 19:37:50 crc kubenswrapper[4974]: I1013 19:37:50.778544 4974 
generic.go:334] "Generic (PLEG): container finished" podID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerID="5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5" exitCode=0 Oct 13 19:37:50 crc kubenswrapper[4974]: I1013 19:37:50.778648 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2drh" event={"ID":"cd47e9fc-22b4-44af-b2f0-cf7672b73c03","Type":"ContainerDied","Data":"5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5"} Oct 13 19:37:51 crc kubenswrapper[4974]: I1013 19:37:51.789908 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2drh" event={"ID":"cd47e9fc-22b4-44af-b2f0-cf7672b73c03","Type":"ContainerStarted","Data":"dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5"} Oct 13 19:37:51 crc kubenswrapper[4974]: I1013 19:37:51.809436 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c2drh" podStartSLOduration=2.372061228 podStartE2EDuration="4.809410641s" podCreationTimestamp="2025-10-13 19:37:47 +0000 UTC" firstStartedPulling="2025-10-13 19:37:48.744535903 +0000 UTC m=+5003.648901983" lastFinishedPulling="2025-10-13 19:37:51.181885276 +0000 UTC m=+5006.086251396" observedRunningTime="2025-10-13 19:37:51.807194949 +0000 UTC m=+5006.711561029" watchObservedRunningTime="2025-10-13 19:37:51.809410641 +0000 UTC m=+5006.713776721" Oct 13 19:37:57 crc kubenswrapper[4974]: I1013 19:37:57.769294 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:57 crc kubenswrapper[4974]: I1013 19:37:57.769799 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:57 crc kubenswrapper[4974]: I1013 19:37:57.841898 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:57 crc kubenswrapper[4974]: I1013 19:37:57.949215 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:37:58 crc kubenswrapper[4974]: I1013 19:37:58.091427 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c2drh"] Oct 13 19:37:59 crc kubenswrapper[4974]: I1013 19:37:59.893876 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c2drh" podUID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerName="registry-server" containerID="cri-o://dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5" gracePeriod=2 Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.390696 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.508016 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-catalog-content\") pod \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.508347 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-utilities\") pod \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.509511 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-utilities" (OuterVolumeSpecName: "utilities") pod "cd47e9fc-22b4-44af-b2f0-cf7672b73c03" (UID: 
"cd47e9fc-22b4-44af-b2f0-cf7672b73c03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.509702 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr29h\" (UniqueName: \"kubernetes.io/projected/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-kube-api-access-rr29h\") pod \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\" (UID: \"cd47e9fc-22b4-44af-b2f0-cf7672b73c03\") " Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.511126 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.518732 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-kube-api-access-rr29h" (OuterVolumeSpecName: "kube-api-access-rr29h") pod "cd47e9fc-22b4-44af-b2f0-cf7672b73c03" (UID: "cd47e9fc-22b4-44af-b2f0-cf7672b73c03"). InnerVolumeSpecName "kube-api-access-rr29h". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.525352 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd47e9fc-22b4-44af-b2f0-cf7672b73c03" (UID: "cd47e9fc-22b4-44af-b2f0-cf7672b73c03"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.613511 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr29h\" (UniqueName: \"kubernetes.io/projected/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-kube-api-access-rr29h\") on node \"crc\" DevicePath \"\"" Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.613562 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd47e9fc-22b4-44af-b2f0-cf7672b73c03-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.911039 4974 generic.go:334] "Generic (PLEG): container finished" podID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerID="dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5" exitCode=0 Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.911102 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2drh" event={"ID":"cd47e9fc-22b4-44af-b2f0-cf7672b73c03","Type":"ContainerDied","Data":"dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5"} Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.911141 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c2drh" event={"ID":"cd47e9fc-22b4-44af-b2f0-cf7672b73c03","Type":"ContainerDied","Data":"4adc524156696f1898dbe768d1430bc5f8a2ec7ce17525428e100c9cc18429ba"} Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.911153 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c2drh" Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.911181 4974 scope.go:117] "RemoveContainer" containerID="dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5" Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.949477 4974 scope.go:117] "RemoveContainer" containerID="5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5" Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.973168 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c2drh"] Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.986317 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c2drh"] Oct 13 19:38:00 crc kubenswrapper[4974]: I1013 19:38:00.993564 4974 scope.go:117] "RemoveContainer" containerID="346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13" Oct 13 19:38:01 crc kubenswrapper[4974]: I1013 19:38:01.034515 4974 scope.go:117] "RemoveContainer" containerID="dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5" Oct 13 19:38:01 crc kubenswrapper[4974]: E1013 19:38:01.035246 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5\": container with ID starting with dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5 not found: ID does not exist" containerID="dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5" Oct 13 19:38:01 crc kubenswrapper[4974]: I1013 19:38:01.035312 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5"} err="failed to get container status \"dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5\": rpc error: code = NotFound desc = could not find container 
\"dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5\": container with ID starting with dd8580fa6a0ba5250de3f5afef9bb89c6859ba9c8c66ff7c2cd16d474cd6e2a5 not found: ID does not exist" Oct 13 19:38:01 crc kubenswrapper[4974]: I1013 19:38:01.035355 4974 scope.go:117] "RemoveContainer" containerID="5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5" Oct 13 19:38:01 crc kubenswrapper[4974]: E1013 19:38:01.035902 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5\": container with ID starting with 5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5 not found: ID does not exist" containerID="5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5" Oct 13 19:38:01 crc kubenswrapper[4974]: I1013 19:38:01.036058 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5"} err="failed to get container status \"5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5\": rpc error: code = NotFound desc = could not find container \"5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5\": container with ID starting with 5361139f6eeee4e265429a3c815dc3ac83bc6853d74a17dace618275585cdba5 not found: ID does not exist" Oct 13 19:38:01 crc kubenswrapper[4974]: I1013 19:38:01.036169 4974 scope.go:117] "RemoveContainer" containerID="346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13" Oct 13 19:38:01 crc kubenswrapper[4974]: E1013 19:38:01.036644 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13\": container with ID starting with 346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13 not found: ID does not exist" 
containerID="346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13" Oct 13 19:38:01 crc kubenswrapper[4974]: I1013 19:38:01.037142 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13"} err="failed to get container status \"346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13\": rpc error: code = NotFound desc = could not find container \"346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13\": container with ID starting with 346c4a7a7c7ccd23ffe698758c2282856f1166eb7983e595b2ba498f33217f13 not found: ID does not exist" Oct 13 19:38:01 crc kubenswrapper[4974]: I1013 19:38:01.828280 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" path="/var/lib/kubelet/pods/cd47e9fc-22b4-44af-b2f0-cf7672b73c03/volumes" Oct 13 19:38:07 crc kubenswrapper[4974]: I1013 19:38:07.743139 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:38:07 crc kubenswrapper[4974]: I1013 19:38:07.743818 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:38:37 crc kubenswrapper[4974]: I1013 19:38:37.743298 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Oct 13 19:38:37 crc kubenswrapper[4974]: I1013 19:38:37.744084 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:39:07 crc kubenswrapper[4974]: I1013 19:39:07.743637 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:39:07 crc kubenswrapper[4974]: I1013 19:39:07.746127 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:39:07 crc kubenswrapper[4974]: I1013 19:39:07.746234 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 19:39:07 crc kubenswrapper[4974]: I1013 19:39:07.747399 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e5cb269b79af5d7941f13fd37362de02a8097d81ebc145023dbe7d25bbf89e19"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 19:39:07 crc kubenswrapper[4974]: I1013 19:39:07.747534 4974 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://e5cb269b79af5d7941f13fd37362de02a8097d81ebc145023dbe7d25bbf89e19" gracePeriod=600 Oct 13 19:39:08 crc kubenswrapper[4974]: I1013 19:39:08.755180 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="e5cb269b79af5d7941f13fd37362de02a8097d81ebc145023dbe7d25bbf89e19" exitCode=0 Oct 13 19:39:08 crc kubenswrapper[4974]: I1013 19:39:08.755270 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"e5cb269b79af5d7941f13fd37362de02a8097d81ebc145023dbe7d25bbf89e19"} Oct 13 19:39:08 crc kubenswrapper[4974]: I1013 19:39:08.755594 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc"} Oct 13 19:39:08 crc kubenswrapper[4974]: I1013 19:39:08.755627 4974 scope.go:117] "RemoveContainer" containerID="5994862c7bacbb7815ba2e1aa2aa7e8daafb310f11d14f084a373205ba9bd41d" Oct 13 19:41:37 crc kubenswrapper[4974]: I1013 19:41:37.743186 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:41:37 crc kubenswrapper[4974]: I1013 19:41:37.743915 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.514999 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4nhgm"] Oct 13 19:41:41 crc kubenswrapper[4974]: E1013 19:41:41.516848 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerName="extract-content" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.516885 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerName="extract-content" Oct 13 19:41:41 crc kubenswrapper[4974]: E1013 19:41:41.516935 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerName="registry-server" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.516953 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerName="registry-server" Oct 13 19:41:41 crc kubenswrapper[4974]: E1013 19:41:41.517002 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerName="extract-utilities" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.517020 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerName="extract-utilities" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.517594 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd47e9fc-22b4-44af-b2f0-cf7672b73c03" containerName="registry-server" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.521098 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.540859 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4nhgm"] Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.690888 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlr9h\" (UniqueName: \"kubernetes.io/projected/dc896aef-13b7-4588-8e78-8734493fe6d9-kube-api-access-vlr9h\") pod \"certified-operators-4nhgm\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.691170 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-catalog-content\") pod \"certified-operators-4nhgm\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.691376 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-utilities\") pod \"certified-operators-4nhgm\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.793415 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-utilities\") pod \"certified-operators-4nhgm\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.793540 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vlr9h\" (UniqueName: \"kubernetes.io/projected/dc896aef-13b7-4588-8e78-8734493fe6d9-kube-api-access-vlr9h\") pod \"certified-operators-4nhgm\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.793724 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-catalog-content\") pod \"certified-operators-4nhgm\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.794122 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-utilities\") pod \"certified-operators-4nhgm\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.794281 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-catalog-content\") pod \"certified-operators-4nhgm\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.819262 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlr9h\" (UniqueName: \"kubernetes.io/projected/dc896aef-13b7-4588-8e78-8734493fe6d9-kube-api-access-vlr9h\") pod \"certified-operators-4nhgm\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:41 crc kubenswrapper[4974]: I1013 19:41:41.850404 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:42 crc kubenswrapper[4974]: I1013 19:41:42.409071 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4nhgm"] Oct 13 19:41:42 crc kubenswrapper[4974]: I1013 19:41:42.523933 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhgm" event={"ID":"dc896aef-13b7-4588-8e78-8734493fe6d9","Type":"ContainerStarted","Data":"0ff67a51e44c8dffd24ec8ccf4df6fd2d3707c7095636040a2be214c11b3c422"} Oct 13 19:41:43 crc kubenswrapper[4974]: I1013 19:41:43.540109 4974 generic.go:334] "Generic (PLEG): container finished" podID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerID="4a211e774ef126e2fda399468d4dbd5c2a0e60d788111d39f0f536a861aa9d2c" exitCode=0 Oct 13 19:41:43 crc kubenswrapper[4974]: I1013 19:41:43.540198 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhgm" event={"ID":"dc896aef-13b7-4588-8e78-8734493fe6d9","Type":"ContainerDied","Data":"4a211e774ef126e2fda399468d4dbd5c2a0e60d788111d39f0f536a861aa9d2c"} Oct 13 19:41:43 crc kubenswrapper[4974]: I1013 19:41:43.562049 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 19:41:45 crc kubenswrapper[4974]: I1013 19:41:45.570974 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhgm" event={"ID":"dc896aef-13b7-4588-8e78-8734493fe6d9","Type":"ContainerStarted","Data":"87a520e7212d3efef0260b64a0eff31d24414afad98eef021d017d851c461a51"} Oct 13 19:41:47 crc kubenswrapper[4974]: I1013 19:41:47.609639 4974 generic.go:334] "Generic (PLEG): container finished" podID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerID="87a520e7212d3efef0260b64a0eff31d24414afad98eef021d017d851c461a51" exitCode=0 Oct 13 19:41:47 crc kubenswrapper[4974]: I1013 19:41:47.610807 4974 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-4nhgm" event={"ID":"dc896aef-13b7-4588-8e78-8734493fe6d9","Type":"ContainerDied","Data":"87a520e7212d3efef0260b64a0eff31d24414afad98eef021d017d851c461a51"} Oct 13 19:41:48 crc kubenswrapper[4974]: I1013 19:41:48.629132 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhgm" event={"ID":"dc896aef-13b7-4588-8e78-8734493fe6d9","Type":"ContainerStarted","Data":"beebf8565552ba51900380958218513dc89cfe08aaf41a6041b8061700ab9335"} Oct 13 19:41:48 crc kubenswrapper[4974]: I1013 19:41:48.677968 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4nhgm" podStartSLOduration=3.2091608369999998 podStartE2EDuration="7.677934734s" podCreationTimestamp="2025-10-13 19:41:41 +0000 UTC" firstStartedPulling="2025-10-13 19:41:43.561596009 +0000 UTC m=+5238.465962099" lastFinishedPulling="2025-10-13 19:41:48.030369906 +0000 UTC m=+5242.934735996" observedRunningTime="2025-10-13 19:41:48.655125632 +0000 UTC m=+5243.559491772" watchObservedRunningTime="2025-10-13 19:41:48.677934734 +0000 UTC m=+5243.582300854" Oct 13 19:41:51 crc kubenswrapper[4974]: I1013 19:41:51.850781 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:51 crc kubenswrapper[4974]: I1013 19:41:51.851518 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:41:51 crc kubenswrapper[4974]: I1013 19:41:51.908265 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:42:01 crc kubenswrapper[4974]: I1013 19:42:01.915499 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:42:01 crc kubenswrapper[4974]: I1013 
19:42:01.967292 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4nhgm"] Oct 13 19:42:02 crc kubenswrapper[4974]: I1013 19:42:02.819081 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4nhgm" podUID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerName="registry-server" containerID="cri-o://beebf8565552ba51900380958218513dc89cfe08aaf41a6041b8061700ab9335" gracePeriod=2 Oct 13 19:42:03 crc kubenswrapper[4974]: I1013 19:42:03.835163 4974 generic.go:334] "Generic (PLEG): container finished" podID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerID="beebf8565552ba51900380958218513dc89cfe08aaf41a6041b8061700ab9335" exitCode=0 Oct 13 19:42:03 crc kubenswrapper[4974]: I1013 19:42:03.835298 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhgm" event={"ID":"dc896aef-13b7-4588-8e78-8734493fe6d9","Type":"ContainerDied","Data":"beebf8565552ba51900380958218513dc89cfe08aaf41a6041b8061700ab9335"} Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.025064 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.141609 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-catalog-content\") pod \"dc896aef-13b7-4588-8e78-8734493fe6d9\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.142030 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-utilities\") pod \"dc896aef-13b7-4588-8e78-8734493fe6d9\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.142087 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlr9h\" (UniqueName: \"kubernetes.io/projected/dc896aef-13b7-4588-8e78-8734493fe6d9-kube-api-access-vlr9h\") pod \"dc896aef-13b7-4588-8e78-8734493fe6d9\" (UID: \"dc896aef-13b7-4588-8e78-8734493fe6d9\") " Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.143301 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-utilities" (OuterVolumeSpecName: "utilities") pod "dc896aef-13b7-4588-8e78-8734493fe6d9" (UID: "dc896aef-13b7-4588-8e78-8734493fe6d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.149803 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc896aef-13b7-4588-8e78-8734493fe6d9-kube-api-access-vlr9h" (OuterVolumeSpecName: "kube-api-access-vlr9h") pod "dc896aef-13b7-4588-8e78-8734493fe6d9" (UID: "dc896aef-13b7-4588-8e78-8734493fe6d9"). InnerVolumeSpecName "kube-api-access-vlr9h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.187938 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dc896aef-13b7-4588-8e78-8734493fe6d9" (UID: "dc896aef-13b7-4588-8e78-8734493fe6d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.250537 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.251074 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc896aef-13b7-4588-8e78-8734493fe6d9-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.251107 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlr9h\" (UniqueName: \"kubernetes.io/projected/dc896aef-13b7-4588-8e78-8734493fe6d9-kube-api-access-vlr9h\") on node \"crc\" DevicePath \"\"" Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.855763 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4nhgm" event={"ID":"dc896aef-13b7-4588-8e78-8734493fe6d9","Type":"ContainerDied","Data":"0ff67a51e44c8dffd24ec8ccf4df6fd2d3707c7095636040a2be214c11b3c422"} Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.855844 4974 scope.go:117] "RemoveContainer" containerID="beebf8565552ba51900380958218513dc89cfe08aaf41a6041b8061700ab9335" Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.856642 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4nhgm" Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.896947 4974 scope.go:117] "RemoveContainer" containerID="87a520e7212d3efef0260b64a0eff31d24414afad98eef021d017d851c461a51" Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.905755 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4nhgm"] Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.918144 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4nhgm"] Oct 13 19:42:04 crc kubenswrapper[4974]: I1013 19:42:04.942452 4974 scope.go:117] "RemoveContainer" containerID="4a211e774ef126e2fda399468d4dbd5c2a0e60d788111d39f0f536a861aa9d2c" Oct 13 19:42:05 crc kubenswrapper[4974]: I1013 19:42:05.832964 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc896aef-13b7-4588-8e78-8734493fe6d9" path="/var/lib/kubelet/pods/dc896aef-13b7-4588-8e78-8734493fe6d9/volumes" Oct 13 19:42:07 crc kubenswrapper[4974]: I1013 19:42:07.742988 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:42:07 crc kubenswrapper[4974]: I1013 19:42:07.743336 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.315530 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vbkz7"] Oct 13 19:42:36 crc kubenswrapper[4974]: E1013 
19:42:36.316676 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerName="extract-content" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.316696 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerName="extract-content" Oct 13 19:42:36 crc kubenswrapper[4974]: E1013 19:42:36.316741 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerName="registry-server" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.316753 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerName="registry-server" Oct 13 19:42:36 crc kubenswrapper[4974]: E1013 19:42:36.316778 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerName="extract-utilities" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.316794 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerName="extract-utilities" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.317139 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc896aef-13b7-4588-8e78-8734493fe6d9" containerName="registry-server" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.320711 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.333943 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vbkz7"] Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.368237 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwr79\" (UniqueName: \"kubernetes.io/projected/b39d0e3a-aa36-4c55-8aa3-67e214236562-kube-api-access-hwr79\") pod \"redhat-operators-vbkz7\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.368286 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-utilities\") pod \"redhat-operators-vbkz7\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.368446 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-catalog-content\") pod \"redhat-operators-vbkz7\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.470315 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-catalog-content\") pod \"redhat-operators-vbkz7\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.470525 4974 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-hwr79\" (UniqueName: \"kubernetes.io/projected/b39d0e3a-aa36-4c55-8aa3-67e214236562-kube-api-access-hwr79\") pod \"redhat-operators-vbkz7\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.470544 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-utilities\") pod \"redhat-operators-vbkz7\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.470945 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-utilities\") pod \"redhat-operators-vbkz7\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.471051 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-catalog-content\") pod \"redhat-operators-vbkz7\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.493113 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwr79\" (UniqueName: \"kubernetes.io/projected/b39d0e3a-aa36-4c55-8aa3-67e214236562-kube-api-access-hwr79\") pod \"redhat-operators-vbkz7\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:36 crc kubenswrapper[4974]: I1013 19:42:36.674560 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:37 crc kubenswrapper[4974]: I1013 19:42:37.138563 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vbkz7"] Oct 13 19:42:37 crc kubenswrapper[4974]: I1013 19:42:37.252521 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbkz7" event={"ID":"b39d0e3a-aa36-4c55-8aa3-67e214236562","Type":"ContainerStarted","Data":"ebeacba42f2c84745d0d1ab25f072a1e52f1c21ad8cebe5649dfa35b48004898"} Oct 13 19:42:37 crc kubenswrapper[4974]: I1013 19:42:37.743319 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:42:37 crc kubenswrapper[4974]: I1013 19:42:37.744800 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:42:37 crc kubenswrapper[4974]: I1013 19:42:37.744851 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 19:42:37 crc kubenswrapper[4974]: I1013 19:42:37.745671 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 19:42:37 crc kubenswrapper[4974]: 
I1013 19:42:37.745720 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" gracePeriod=600 Oct 13 19:42:37 crc kubenswrapper[4974]: E1013 19:42:37.883665 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:42:38 crc kubenswrapper[4974]: I1013 19:42:38.264962 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" exitCode=0 Oct 13 19:42:38 crc kubenswrapper[4974]: I1013 19:42:38.265012 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc"} Oct 13 19:42:38 crc kubenswrapper[4974]: I1013 19:42:38.265097 4974 scope.go:117] "RemoveContainer" containerID="e5cb269b79af5d7941f13fd37362de02a8097d81ebc145023dbe7d25bbf89e19" Oct 13 19:42:38 crc kubenswrapper[4974]: I1013 19:42:38.266588 4974 generic.go:334] "Generic (PLEG): container finished" podID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerID="8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01" exitCode=0 Oct 13 19:42:38 crc kubenswrapper[4974]: I1013 19:42:38.266632 4974 scope.go:117] "RemoveContainer" 
containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:42:38 crc kubenswrapper[4974]: I1013 19:42:38.266727 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbkz7" event={"ID":"b39d0e3a-aa36-4c55-8aa3-67e214236562","Type":"ContainerDied","Data":"8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01"} Oct 13 19:42:38 crc kubenswrapper[4974]: E1013 19:42:38.266928 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:42:40 crc kubenswrapper[4974]: I1013 19:42:40.301129 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbkz7" event={"ID":"b39d0e3a-aa36-4c55-8aa3-67e214236562","Type":"ContainerStarted","Data":"3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea"} Oct 13 19:42:44 crc kubenswrapper[4974]: I1013 19:42:44.357830 4974 generic.go:334] "Generic (PLEG): container finished" podID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerID="3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea" exitCode=0 Oct 13 19:42:44 crc kubenswrapper[4974]: I1013 19:42:44.357947 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbkz7" event={"ID":"b39d0e3a-aa36-4c55-8aa3-67e214236562","Type":"ContainerDied","Data":"3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea"} Oct 13 19:42:45 crc kubenswrapper[4974]: I1013 19:42:45.371291 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbkz7" 
event={"ID":"b39d0e3a-aa36-4c55-8aa3-67e214236562","Type":"ContainerStarted","Data":"84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6"} Oct 13 19:42:45 crc kubenswrapper[4974]: I1013 19:42:45.392476 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vbkz7" podStartSLOduration=2.766134518 podStartE2EDuration="9.392457651s" podCreationTimestamp="2025-10-13 19:42:36 +0000 UTC" firstStartedPulling="2025-10-13 19:42:38.268806318 +0000 UTC m=+5293.173172438" lastFinishedPulling="2025-10-13 19:42:44.895129481 +0000 UTC m=+5299.799495571" observedRunningTime="2025-10-13 19:42:45.38815574 +0000 UTC m=+5300.292521820" watchObservedRunningTime="2025-10-13 19:42:45.392457651 +0000 UTC m=+5300.296823731" Oct 13 19:42:46 crc kubenswrapper[4974]: I1013 19:42:46.675545 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:46 crc kubenswrapper[4974]: I1013 19:42:46.675899 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:42:47 crc kubenswrapper[4974]: I1013 19:42:47.749059 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vbkz7" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerName="registry-server" probeResult="failure" output=< Oct 13 19:42:47 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 19:42:47 crc kubenswrapper[4974]: > Oct 13 19:42:51 crc kubenswrapper[4974]: I1013 19:42:51.812636 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:42:51 crc kubenswrapper[4974]: E1013 19:42:51.813526 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:42:57 crc kubenswrapper[4974]: I1013 19:42:57.729467 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vbkz7" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerName="registry-server" probeResult="failure" output=< Oct 13 19:42:57 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 19:42:57 crc kubenswrapper[4974]: > Oct 13 19:43:02 crc kubenswrapper[4974]: I1013 19:43:02.811485 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:43:02 crc kubenswrapper[4974]: E1013 19:43:02.812167 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:43:06 crc kubenswrapper[4974]: I1013 19:43:06.764793 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:43:06 crc kubenswrapper[4974]: I1013 19:43:06.860675 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:43:07 crc kubenswrapper[4974]: I1013 19:43:07.513552 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vbkz7"] Oct 13 19:43:08 crc kubenswrapper[4974]: I1013 19:43:08.656354 4974 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-marketplace/redhat-operators-vbkz7" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerName="registry-server" containerID="cri-o://84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6" gracePeriod=2 Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.207103 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.392177 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-catalog-content\") pod \"b39d0e3a-aa36-4c55-8aa3-67e214236562\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.393104 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwr79\" (UniqueName: \"kubernetes.io/projected/b39d0e3a-aa36-4c55-8aa3-67e214236562-kube-api-access-hwr79\") pod \"b39d0e3a-aa36-4c55-8aa3-67e214236562\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.393343 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-utilities\") pod \"b39d0e3a-aa36-4c55-8aa3-67e214236562\" (UID: \"b39d0e3a-aa36-4c55-8aa3-67e214236562\") " Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.394240 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-utilities" (OuterVolumeSpecName: "utilities") pod "b39d0e3a-aa36-4c55-8aa3-67e214236562" (UID: "b39d0e3a-aa36-4c55-8aa3-67e214236562"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.406941 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39d0e3a-aa36-4c55-8aa3-67e214236562-kube-api-access-hwr79" (OuterVolumeSpecName: "kube-api-access-hwr79") pod "b39d0e3a-aa36-4c55-8aa3-67e214236562" (UID: "b39d0e3a-aa36-4c55-8aa3-67e214236562"). InnerVolumeSpecName "kube-api-access-hwr79". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.471849 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b39d0e3a-aa36-4c55-8aa3-67e214236562" (UID: "b39d0e3a-aa36-4c55-8aa3-67e214236562"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.497218 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.497263 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b39d0e3a-aa36-4c55-8aa3-67e214236562-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.497281 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwr79\" (UniqueName: \"kubernetes.io/projected/b39d0e3a-aa36-4c55-8aa3-67e214236562-kube-api-access-hwr79\") on node \"crc\" DevicePath \"\"" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.675372 4974 generic.go:334] "Generic (PLEG): container finished" podID="b39d0e3a-aa36-4c55-8aa3-67e214236562" 
containerID="84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6" exitCode=0 Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.675426 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbkz7" event={"ID":"b39d0e3a-aa36-4c55-8aa3-67e214236562","Type":"ContainerDied","Data":"84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6"} Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.675457 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vbkz7" event={"ID":"b39d0e3a-aa36-4c55-8aa3-67e214236562","Type":"ContainerDied","Data":"ebeacba42f2c84745d0d1ab25f072a1e52f1c21ad8cebe5649dfa35b48004898"} Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.675479 4974 scope.go:117] "RemoveContainer" containerID="84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.675633 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vbkz7" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.700371 4974 scope.go:117] "RemoveContainer" containerID="3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.726927 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vbkz7"] Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.739776 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vbkz7"] Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.758042 4974 scope.go:117] "RemoveContainer" containerID="8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.789136 4974 scope.go:117] "RemoveContainer" containerID="84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6" Oct 13 19:43:09 crc kubenswrapper[4974]: E1013 19:43:09.789438 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6\": container with ID starting with 84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6 not found: ID does not exist" containerID="84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.789470 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6"} err="failed to get container status \"84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6\": rpc error: code = NotFound desc = could not find container \"84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6\": container with ID starting with 84bef9e38d8d337f0f70f0023f7066861c8d39bbce3a1dfc40bdad32c4e3d9a6 not found: ID does 
not exist" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.789488 4974 scope.go:117] "RemoveContainer" containerID="3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea" Oct 13 19:43:09 crc kubenswrapper[4974]: E1013 19:43:09.789874 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea\": container with ID starting with 3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea not found: ID does not exist" containerID="3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.789898 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea"} err="failed to get container status \"3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea\": rpc error: code = NotFound desc = could not find container \"3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea\": container with ID starting with 3506f6b1dfc89c7ae694371d02ca20aa066c7171e8d948dd06e66d0fbba8f9ea not found: ID does not exist" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.789913 4974 scope.go:117] "RemoveContainer" containerID="8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01" Oct 13 19:43:09 crc kubenswrapper[4974]: E1013 19:43:09.790198 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01\": container with ID starting with 8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01 not found: ID does not exist" containerID="8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.790220 4974 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01"} err="failed to get container status \"8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01\": rpc error: code = NotFound desc = could not find container \"8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01\": container with ID starting with 8ce1551359f8877fa293bbc9b3b99ef202f6949fd700c4bb270cf8da7f342e01 not found: ID does not exist" Oct 13 19:43:09 crc kubenswrapper[4974]: I1013 19:43:09.823481 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" path="/var/lib/kubelet/pods/b39d0e3a-aa36-4c55-8aa3-67e214236562/volumes" Oct 13 19:43:17 crc kubenswrapper[4974]: I1013 19:43:17.813315 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:43:17 crc kubenswrapper[4974]: E1013 19:43:17.814240 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:43:28 crc kubenswrapper[4974]: I1013 19:43:28.811837 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:43:28 crc kubenswrapper[4974]: E1013 19:43:28.813104 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:43:42 crc kubenswrapper[4974]: I1013 19:43:42.811854 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:43:42 crc kubenswrapper[4974]: E1013 19:43:42.812988 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:43:57 crc kubenswrapper[4974]: I1013 19:43:57.812244 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:43:57 crc kubenswrapper[4974]: E1013 19:43:57.812990 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:44:09 crc kubenswrapper[4974]: I1013 19:44:09.812151 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:44:09 crc kubenswrapper[4974]: E1013 19:44:09.813103 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:44:23 crc kubenswrapper[4974]: I1013 19:44:23.812830 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:44:23 crc kubenswrapper[4974]: E1013 19:44:23.815417 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:44:37 crc kubenswrapper[4974]: I1013 19:44:37.812445 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:44:37 crc kubenswrapper[4974]: E1013 19:44:37.813961 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:44:48 crc kubenswrapper[4974]: I1013 19:44:48.811397 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:44:48 crc kubenswrapper[4974]: E1013 19:44:48.812179 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.173400 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr"] Oct 13 19:45:00 crc kubenswrapper[4974]: E1013 19:45:00.175266 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerName="extract-content" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.175351 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerName="extract-content" Oct 13 19:45:00 crc kubenswrapper[4974]: E1013 19:45:00.175439 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerName="extract-utilities" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.175497 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerName="extract-utilities" Oct 13 19:45:00 crc kubenswrapper[4974]: E1013 19:45:00.175565 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerName="registry-server" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.175621 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerName="registry-server" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.175891 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39d0e3a-aa36-4c55-8aa3-67e214236562" containerName="registry-server" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.176630 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.184833 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.187128 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.192397 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr"] Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.246410 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czjcj\" (UniqueName: \"kubernetes.io/projected/b9c433b8-3091-4128-ba86-64efe088e603-kube-api-access-czjcj\") pod \"collect-profiles-29339745-5tfrr\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.246511 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9c433b8-3091-4128-ba86-64efe088e603-config-volume\") pod \"collect-profiles-29339745-5tfrr\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.246682 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9c433b8-3091-4128-ba86-64efe088e603-secret-volume\") pod \"collect-profiles-29339745-5tfrr\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.349023 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czjcj\" (UniqueName: \"kubernetes.io/projected/b9c433b8-3091-4128-ba86-64efe088e603-kube-api-access-czjcj\") pod \"collect-profiles-29339745-5tfrr\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.349103 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9c433b8-3091-4128-ba86-64efe088e603-config-volume\") pod \"collect-profiles-29339745-5tfrr\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.349160 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9c433b8-3091-4128-ba86-64efe088e603-secret-volume\") pod \"collect-profiles-29339745-5tfrr\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.350466 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9c433b8-3091-4128-ba86-64efe088e603-config-volume\") pod \"collect-profiles-29339745-5tfrr\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.366071 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b9c433b8-3091-4128-ba86-64efe088e603-secret-volume\") pod \"collect-profiles-29339745-5tfrr\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.369836 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czjcj\" (UniqueName: \"kubernetes.io/projected/b9c433b8-3091-4128-ba86-64efe088e603-kube-api-access-czjcj\") pod \"collect-profiles-29339745-5tfrr\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.500159 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:00 crc kubenswrapper[4974]: I1013 19:45:00.998742 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr"] Oct 13 19:45:01 crc kubenswrapper[4974]: I1013 19:45:01.811378 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:45:01 crc kubenswrapper[4974]: E1013 19:45:01.812116 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:45:02 crc kubenswrapper[4974]: I1013 19:45:02.016323 4974 generic.go:334] "Generic (PLEG): container finished" podID="b9c433b8-3091-4128-ba86-64efe088e603" containerID="2128d19b60b629c93d6961dec352b88ab70d61900a1e2f72e941460983baa54c" 
exitCode=0 Oct 13 19:45:02 crc kubenswrapper[4974]: I1013 19:45:02.016395 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" event={"ID":"b9c433b8-3091-4128-ba86-64efe088e603","Type":"ContainerDied","Data":"2128d19b60b629c93d6961dec352b88ab70d61900a1e2f72e941460983baa54c"} Oct 13 19:45:02 crc kubenswrapper[4974]: I1013 19:45:02.016564 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" event={"ID":"b9c433b8-3091-4128-ba86-64efe088e603","Type":"ContainerStarted","Data":"18d41195b7bdbf56af20a8f41fe2fc4f9b4c4a51a4cb060f7c2952137ef550b7"} Oct 13 19:45:03 crc kubenswrapper[4974]: I1013 19:45:03.497952 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:03 crc kubenswrapper[4974]: I1013 19:45:03.631711 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9c433b8-3091-4128-ba86-64efe088e603-config-volume\") pod \"b9c433b8-3091-4128-ba86-64efe088e603\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " Oct 13 19:45:03 crc kubenswrapper[4974]: I1013 19:45:03.631837 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czjcj\" (UniqueName: \"kubernetes.io/projected/b9c433b8-3091-4128-ba86-64efe088e603-kube-api-access-czjcj\") pod \"b9c433b8-3091-4128-ba86-64efe088e603\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " Oct 13 19:45:03 crc kubenswrapper[4974]: I1013 19:45:03.631912 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9c433b8-3091-4128-ba86-64efe088e603-secret-volume\") pod \"b9c433b8-3091-4128-ba86-64efe088e603\" (UID: \"b9c433b8-3091-4128-ba86-64efe088e603\") " Oct 13 
19:45:03 crc kubenswrapper[4974]: I1013 19:45:03.632251 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9c433b8-3091-4128-ba86-64efe088e603-config-volume" (OuterVolumeSpecName: "config-volume") pod "b9c433b8-3091-4128-ba86-64efe088e603" (UID: "b9c433b8-3091-4128-ba86-64efe088e603"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 19:45:03 crc kubenswrapper[4974]: I1013 19:45:03.632355 4974 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9c433b8-3091-4128-ba86-64efe088e603-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 19:45:03 crc kubenswrapper[4974]: I1013 19:45:03.637495 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9c433b8-3091-4128-ba86-64efe088e603-kube-api-access-czjcj" (OuterVolumeSpecName: "kube-api-access-czjcj") pod "b9c433b8-3091-4128-ba86-64efe088e603" (UID: "b9c433b8-3091-4128-ba86-64efe088e603"). InnerVolumeSpecName "kube-api-access-czjcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:45:03 crc kubenswrapper[4974]: I1013 19:45:03.638027 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9c433b8-3091-4128-ba86-64efe088e603-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b9c433b8-3091-4128-ba86-64efe088e603" (UID: "b9c433b8-3091-4128-ba86-64efe088e603"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:45:03 crc kubenswrapper[4974]: I1013 19:45:03.735953 4974 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9c433b8-3091-4128-ba86-64efe088e603-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 19:45:03 crc kubenswrapper[4974]: I1013 19:45:03.736018 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czjcj\" (UniqueName: \"kubernetes.io/projected/b9c433b8-3091-4128-ba86-64efe088e603-kube-api-access-czjcj\") on node \"crc\" DevicePath \"\"" Oct 13 19:45:04 crc kubenswrapper[4974]: I1013 19:45:04.050935 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" event={"ID":"b9c433b8-3091-4128-ba86-64efe088e603","Type":"ContainerDied","Data":"18d41195b7bdbf56af20a8f41fe2fc4f9b4c4a51a4cb060f7c2952137ef550b7"} Oct 13 19:45:04 crc kubenswrapper[4974]: I1013 19:45:04.050983 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18d41195b7bdbf56af20a8f41fe2fc4f9b4c4a51a4cb060f7c2952137ef550b7" Oct 13 19:45:04 crc kubenswrapper[4974]: I1013 19:45:04.051035 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339745-5tfrr" Oct 13 19:45:04 crc kubenswrapper[4974]: I1013 19:45:04.601505 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"] Oct 13 19:45:04 crc kubenswrapper[4974]: I1013 19:45:04.613996 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339700-vbj88"] Oct 13 19:45:05 crc kubenswrapper[4974]: I1013 19:45:05.835917 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="881aeb7e-48b8-49e3-8b72-3fc27951a12d" path="/var/lib/kubelet/pods/881aeb7e-48b8-49e3-8b72-3fc27951a12d/volumes" Oct 13 19:45:15 crc kubenswrapper[4974]: I1013 19:45:15.830861 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:45:15 crc kubenswrapper[4974]: E1013 19:45:15.831585 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:45:29 crc kubenswrapper[4974]: I1013 19:45:29.812952 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:45:29 crc kubenswrapper[4974]: E1013 19:45:29.813706 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:45:44 crc kubenswrapper[4974]: I1013 19:45:44.815512 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:45:44 crc kubenswrapper[4974]: E1013 19:45:44.816410 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:45:51 crc kubenswrapper[4974]: I1013 19:45:51.814783 4974 scope.go:117] "RemoveContainer" containerID="cf3ec6edb505ad49f02a0767ca960dbdf952cdd9a64be55d447881ebcef746f4" Oct 13 19:45:59 crc kubenswrapper[4974]: I1013 19:45:59.811995 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:45:59 crc kubenswrapper[4974]: E1013 19:45:59.812913 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:46:10 crc kubenswrapper[4974]: I1013 19:46:10.813625 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:46:10 crc kubenswrapper[4974]: E1013 19:46:10.814710 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:46:23 crc kubenswrapper[4974]: I1013 19:46:23.819638 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:46:23 crc kubenswrapper[4974]: E1013 19:46:23.820918 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:46:37 crc kubenswrapper[4974]: I1013 19:46:37.811550 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:46:37 crc kubenswrapper[4974]: E1013 19:46:37.812527 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:46:48 crc kubenswrapper[4974]: I1013 19:46:48.811744 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:46:48 crc kubenswrapper[4974]: E1013 19:46:48.812643 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:47:00 crc kubenswrapper[4974]: I1013 19:47:00.812518 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:47:00 crc kubenswrapper[4974]: E1013 19:47:00.813794 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:47:12 crc kubenswrapper[4974]: I1013 19:47:12.813268 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:47:12 crc kubenswrapper[4974]: E1013 19:47:12.814799 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.251260 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rqh87"] Oct 13 19:47:17 crc kubenswrapper[4974]: E1013 19:47:17.252251 4974 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b9c433b8-3091-4128-ba86-64efe088e603" containerName="collect-profiles" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.252264 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9c433b8-3091-4128-ba86-64efe088e603" containerName="collect-profiles" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.252442 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9c433b8-3091-4128-ba86-64efe088e603" containerName="collect-profiles" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.253811 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.275086 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rqh87"] Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.397569 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-utilities\") pod \"community-operators-rqh87\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") " pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.397708 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ns57\" (UniqueName: \"kubernetes.io/projected/138b2803-1c8f-493a-aa52-e223e4285bdf-kube-api-access-5ns57\") pod \"community-operators-rqh87\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") " pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.397744 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-catalog-content\") pod \"community-operators-rqh87\" (UID: 
\"138b2803-1c8f-493a-aa52-e223e4285bdf\") " pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.499599 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-utilities\") pod \"community-operators-rqh87\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") " pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.499795 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ns57\" (UniqueName: \"kubernetes.io/projected/138b2803-1c8f-493a-aa52-e223e4285bdf-kube-api-access-5ns57\") pod \"community-operators-rqh87\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") " pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.499825 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-catalog-content\") pod \"community-operators-rqh87\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") " pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.500404 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-utilities\") pod \"community-operators-rqh87\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") " pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.500454 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-catalog-content\") pod \"community-operators-rqh87\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") 
" pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.525437 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ns57\" (UniqueName: \"kubernetes.io/projected/138b2803-1c8f-493a-aa52-e223e4285bdf-kube-api-access-5ns57\") pod \"community-operators-rqh87\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") " pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:17 crc kubenswrapper[4974]: I1013 19:47:17.581614 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:18 crc kubenswrapper[4974]: I1013 19:47:18.184362 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rqh87"] Oct 13 19:47:18 crc kubenswrapper[4974]: W1013 19:47:18.199589 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod138b2803_1c8f_493a_aa52_e223e4285bdf.slice/crio-4a0cb8cce7fec67a3313b0ca090421b378bddafa70a8ef91c782e5a41305593d WatchSource:0}: Error finding container 4a0cb8cce7fec67a3313b0ca090421b378bddafa70a8ef91c782e5a41305593d: Status 404 returned error can't find the container with id 4a0cb8cce7fec67a3313b0ca090421b378bddafa70a8ef91c782e5a41305593d Oct 13 19:47:18 crc kubenswrapper[4974]: I1013 19:47:18.786111 4974 generic.go:334] "Generic (PLEG): container finished" podID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerID="0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc" exitCode=0 Oct 13 19:47:18 crc kubenswrapper[4974]: I1013 19:47:18.786197 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqh87" event={"ID":"138b2803-1c8f-493a-aa52-e223e4285bdf","Type":"ContainerDied","Data":"0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc"} Oct 13 19:47:18 crc kubenswrapper[4974]: I1013 
19:47:18.786493 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqh87" event={"ID":"138b2803-1c8f-493a-aa52-e223e4285bdf","Type":"ContainerStarted","Data":"4a0cb8cce7fec67a3313b0ca090421b378bddafa70a8ef91c782e5a41305593d"} Oct 13 19:47:18 crc kubenswrapper[4974]: I1013 19:47:18.790321 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 19:47:20 crc kubenswrapper[4974]: I1013 19:47:20.809494 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqh87" event={"ID":"138b2803-1c8f-493a-aa52-e223e4285bdf","Type":"ContainerStarted","Data":"dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd"} Oct 13 19:47:21 crc kubenswrapper[4974]: I1013 19:47:21.829448 4974 generic.go:334] "Generic (PLEG): container finished" podID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerID="dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd" exitCode=0 Oct 13 19:47:21 crc kubenswrapper[4974]: I1013 19:47:21.844781 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqh87" event={"ID":"138b2803-1c8f-493a-aa52-e223e4285bdf","Type":"ContainerDied","Data":"dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd"} Oct 13 19:47:22 crc kubenswrapper[4974]: I1013 19:47:22.844049 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqh87" event={"ID":"138b2803-1c8f-493a-aa52-e223e4285bdf","Type":"ContainerStarted","Data":"e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901"} Oct 13 19:47:22 crc kubenswrapper[4974]: I1013 19:47:22.871751 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rqh87" podStartSLOduration=2.150402471 podStartE2EDuration="5.871732948s" podCreationTimestamp="2025-10-13 19:47:17 +0000 UTC" 
firstStartedPulling="2025-10-13 19:47:18.789905503 +0000 UTC m=+5573.694271613" lastFinishedPulling="2025-10-13 19:47:22.511236 +0000 UTC m=+5577.415602090" observedRunningTime="2025-10-13 19:47:22.86968358 +0000 UTC m=+5577.774049670" watchObservedRunningTime="2025-10-13 19:47:22.871732948 +0000 UTC m=+5577.776099038" Oct 13 19:47:24 crc kubenswrapper[4974]: I1013 19:47:24.811980 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:47:24 crc kubenswrapper[4974]: E1013 19:47:24.812469 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:47:27 crc kubenswrapper[4974]: I1013 19:47:27.582323 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:27 crc kubenswrapper[4974]: I1013 19:47:27.584647 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:28 crc kubenswrapper[4974]: I1013 19:47:28.382638 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:29 crc kubenswrapper[4974]: I1013 19:47:29.010881 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:29 crc kubenswrapper[4974]: I1013 19:47:29.090982 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rqh87"] Oct 13 19:47:30 crc kubenswrapper[4974]: I1013 19:47:30.943796 4974 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rqh87" podUID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerName="registry-server" containerID="cri-o://e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901" gracePeriod=2 Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.846594 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.956667 4974 generic.go:334] "Generic (PLEG): container finished" podID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerID="e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901" exitCode=0 Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.956708 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqh87" event={"ID":"138b2803-1c8f-493a-aa52-e223e4285bdf","Type":"ContainerDied","Data":"e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901"} Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.956734 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rqh87" Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.956760 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rqh87" event={"ID":"138b2803-1c8f-493a-aa52-e223e4285bdf","Type":"ContainerDied","Data":"4a0cb8cce7fec67a3313b0ca090421b378bddafa70a8ef91c782e5a41305593d"} Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.956778 4974 scope.go:117] "RemoveContainer" containerID="e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901" Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.964591 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ns57\" (UniqueName: \"kubernetes.io/projected/138b2803-1c8f-493a-aa52-e223e4285bdf-kube-api-access-5ns57\") pod \"138b2803-1c8f-493a-aa52-e223e4285bdf\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") " Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.964756 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-catalog-content\") pod \"138b2803-1c8f-493a-aa52-e223e4285bdf\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") " Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.964827 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-utilities\") pod \"138b2803-1c8f-493a-aa52-e223e4285bdf\" (UID: \"138b2803-1c8f-493a-aa52-e223e4285bdf\") " Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.965494 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-utilities" (OuterVolumeSpecName: "utilities") pod "138b2803-1c8f-493a-aa52-e223e4285bdf" (UID: "138b2803-1c8f-493a-aa52-e223e4285bdf"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.970828 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/138b2803-1c8f-493a-aa52-e223e4285bdf-kube-api-access-5ns57" (OuterVolumeSpecName: "kube-api-access-5ns57") pod "138b2803-1c8f-493a-aa52-e223e4285bdf" (UID: "138b2803-1c8f-493a-aa52-e223e4285bdf"). InnerVolumeSpecName "kube-api-access-5ns57". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:47:31 crc kubenswrapper[4974]: I1013 19:47:31.978497 4974 scope.go:117] "RemoveContainer" containerID="dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.015676 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "138b2803-1c8f-493a-aa52-e223e4285bdf" (UID: "138b2803-1c8f-493a-aa52-e223e4285bdf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.036572 4974 scope.go:117] "RemoveContainer" containerID="0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.067262 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.067305 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/138b2803-1c8f-493a-aa52-e223e4285bdf-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.067320 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ns57\" (UniqueName: \"kubernetes.io/projected/138b2803-1c8f-493a-aa52-e223e4285bdf-kube-api-access-5ns57\") on node \"crc\" DevicePath \"\"" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.096955 4974 scope.go:117] "RemoveContainer" containerID="e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901" Oct 13 19:47:32 crc kubenswrapper[4974]: E1013 19:47:32.098148 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901\": container with ID starting with e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901 not found: ID does not exist" containerID="e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.098215 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901"} err="failed to get container status 
\"e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901\": rpc error: code = NotFound desc = could not find container \"e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901\": container with ID starting with e0174db326fb1f2cce92fa89bcfe8de008cd3b2ba10cc331079497c66c6b3901 not found: ID does not exist" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.098256 4974 scope.go:117] "RemoveContainer" containerID="dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd" Oct 13 19:47:32 crc kubenswrapper[4974]: E1013 19:47:32.098868 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd\": container with ID starting with dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd not found: ID does not exist" containerID="dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.098907 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd"} err="failed to get container status \"dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd\": rpc error: code = NotFound desc = could not find container \"dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd\": container with ID starting with dc93931d31bfbcbb610ea7c6d18601b8d2d605051e9ea1899af90c06df3415cd not found: ID does not exist" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.098933 4974 scope.go:117] "RemoveContainer" containerID="0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc" Oct 13 19:47:32 crc kubenswrapper[4974]: E1013 19:47:32.099286 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc\": container with ID starting with 0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc not found: ID does not exist" containerID="0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.099331 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc"} err="failed to get container status \"0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc\": rpc error: code = NotFound desc = could not find container \"0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc\": container with ID starting with 0c77733b76bb402e08018488fb7f62bd5ca1d94a761180ce68ca80dc6510cbcc not found: ID does not exist" Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.300346 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rqh87"] Oct 13 19:47:32 crc kubenswrapper[4974]: I1013 19:47:32.315411 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rqh87"] Oct 13 19:47:33 crc kubenswrapper[4974]: I1013 19:47:33.832861 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="138b2803-1c8f-493a-aa52-e223e4285bdf" path="/var/lib/kubelet/pods/138b2803-1c8f-493a-aa52-e223e4285bdf/volumes" Oct 13 19:47:38 crc kubenswrapper[4974]: I1013 19:47:38.814834 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:47:40 crc kubenswrapper[4974]: I1013 19:47:40.079601 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"aacae48f28a8d16fcda75b3e9bb8a19abe359616918070df5b23b5f7d085a4ab"} Oct 13 19:48:06 
crc kubenswrapper[4974]: I1013 19:48:06.741901 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vwd2l"] Oct 13 19:48:06 crc kubenswrapper[4974]: E1013 19:48:06.743186 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerName="extract-utilities" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.743208 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerName="extract-utilities" Oct 13 19:48:06 crc kubenswrapper[4974]: E1013 19:48:06.743269 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerName="registry-server" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.743281 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerName="registry-server" Oct 13 19:48:06 crc kubenswrapper[4974]: E1013 19:48:06.743301 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerName="extract-content" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.743313 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerName="extract-content" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.743646 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="138b2803-1c8f-493a-aa52-e223e4285bdf" containerName="registry-server" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.746315 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.754753 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwd2l"] Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.840424 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-catalog-content\") pod \"redhat-marketplace-vwd2l\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.840491 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-utilities\") pod \"redhat-marketplace-vwd2l\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.840681 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxslx\" (UniqueName: \"kubernetes.io/projected/734700c6-71f6-41f7-ad56-0a3f8ab61256-kube-api-access-cxslx\") pod \"redhat-marketplace-vwd2l\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.942483 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxslx\" (UniqueName: \"kubernetes.io/projected/734700c6-71f6-41f7-ad56-0a3f8ab61256-kube-api-access-cxslx\") pod \"redhat-marketplace-vwd2l\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.943011 4974 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-catalog-content\") pod \"redhat-marketplace-vwd2l\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.943833 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-catalog-content\") pod \"redhat-marketplace-vwd2l\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.944184 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-utilities\") pod \"redhat-marketplace-vwd2l\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.944243 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-utilities\") pod \"redhat-marketplace-vwd2l\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:06 crc kubenswrapper[4974]: I1013 19:48:06.974601 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxslx\" (UniqueName: \"kubernetes.io/projected/734700c6-71f6-41f7-ad56-0a3f8ab61256-kube-api-access-cxslx\") pod \"redhat-marketplace-vwd2l\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:07 crc kubenswrapper[4974]: I1013 19:48:07.078636 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:07 crc kubenswrapper[4974]: W1013 19:48:07.679357 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod734700c6_71f6_41f7_ad56_0a3f8ab61256.slice/crio-d4aa0dab5e665ed2a6c804807c0b6a5873859c6893e117b230aff89d62d34202 WatchSource:0}: Error finding container d4aa0dab5e665ed2a6c804807c0b6a5873859c6893e117b230aff89d62d34202: Status 404 returned error can't find the container with id d4aa0dab5e665ed2a6c804807c0b6a5873859c6893e117b230aff89d62d34202 Oct 13 19:48:07 crc kubenswrapper[4974]: I1013 19:48:07.686223 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwd2l"] Oct 13 19:48:08 crc kubenswrapper[4974]: I1013 19:48:08.414965 4974 generic.go:334] "Generic (PLEG): container finished" podID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerID="61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16" exitCode=0 Oct 13 19:48:08 crc kubenswrapper[4974]: I1013 19:48:08.415027 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwd2l" event={"ID":"734700c6-71f6-41f7-ad56-0a3f8ab61256","Type":"ContainerDied","Data":"61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16"} Oct 13 19:48:08 crc kubenswrapper[4974]: I1013 19:48:08.415338 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwd2l" event={"ID":"734700c6-71f6-41f7-ad56-0a3f8ab61256","Type":"ContainerStarted","Data":"d4aa0dab5e665ed2a6c804807c0b6a5873859c6893e117b230aff89d62d34202"} Oct 13 19:48:09 crc kubenswrapper[4974]: I1013 19:48:09.432036 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwd2l" 
event={"ID":"734700c6-71f6-41f7-ad56-0a3f8ab61256","Type":"ContainerStarted","Data":"5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39"} Oct 13 19:48:10 crc kubenswrapper[4974]: I1013 19:48:10.450865 4974 generic.go:334] "Generic (PLEG): container finished" podID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerID="5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39" exitCode=0 Oct 13 19:48:10 crc kubenswrapper[4974]: I1013 19:48:10.450912 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwd2l" event={"ID":"734700c6-71f6-41f7-ad56-0a3f8ab61256","Type":"ContainerDied","Data":"5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39"} Oct 13 19:48:11 crc kubenswrapper[4974]: I1013 19:48:11.462435 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwd2l" event={"ID":"734700c6-71f6-41f7-ad56-0a3f8ab61256","Type":"ContainerStarted","Data":"290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7"} Oct 13 19:48:11 crc kubenswrapper[4974]: I1013 19:48:11.489095 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vwd2l" podStartSLOduration=3.002121803 podStartE2EDuration="5.489080803s" podCreationTimestamp="2025-10-13 19:48:06 +0000 UTC" firstStartedPulling="2025-10-13 19:48:08.418194586 +0000 UTC m=+5623.322560696" lastFinishedPulling="2025-10-13 19:48:10.905153596 +0000 UTC m=+5625.809519696" observedRunningTime="2025-10-13 19:48:11.486451109 +0000 UTC m=+5626.390817189" watchObservedRunningTime="2025-10-13 19:48:11.489080803 +0000 UTC m=+5626.393446883" Oct 13 19:48:17 crc kubenswrapper[4974]: I1013 19:48:17.079841 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:17 crc kubenswrapper[4974]: I1013 19:48:17.080680 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:17 crc kubenswrapper[4974]: I1013 19:48:17.145207 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:17 crc kubenswrapper[4974]: I1013 19:48:17.597857 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:17 crc kubenswrapper[4974]: I1013 19:48:17.639957 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwd2l"] Oct 13 19:48:19 crc kubenswrapper[4974]: I1013 19:48:19.550607 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vwd2l" podUID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerName="registry-server" containerID="cri-o://290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7" gracePeriod=2 Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.079004 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.157679 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxslx\" (UniqueName: \"kubernetes.io/projected/734700c6-71f6-41f7-ad56-0a3f8ab61256-kube-api-access-cxslx\") pod \"734700c6-71f6-41f7-ad56-0a3f8ab61256\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.157795 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-utilities\") pod \"734700c6-71f6-41f7-ad56-0a3f8ab61256\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.157889 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-catalog-content\") pod \"734700c6-71f6-41f7-ad56-0a3f8ab61256\" (UID: \"734700c6-71f6-41f7-ad56-0a3f8ab61256\") " Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.159932 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-utilities" (OuterVolumeSpecName: "utilities") pod "734700c6-71f6-41f7-ad56-0a3f8ab61256" (UID: "734700c6-71f6-41f7-ad56-0a3f8ab61256"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.166478 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/734700c6-71f6-41f7-ad56-0a3f8ab61256-kube-api-access-cxslx" (OuterVolumeSpecName: "kube-api-access-cxslx") pod "734700c6-71f6-41f7-ad56-0a3f8ab61256" (UID: "734700c6-71f6-41f7-ad56-0a3f8ab61256"). InnerVolumeSpecName "kube-api-access-cxslx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.172859 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "734700c6-71f6-41f7-ad56-0a3f8ab61256" (UID: "734700c6-71f6-41f7-ad56-0a3f8ab61256"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.260713 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.260753 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/734700c6-71f6-41f7-ad56-0a3f8ab61256-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.260769 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxslx\" (UniqueName: \"kubernetes.io/projected/734700c6-71f6-41f7-ad56-0a3f8ab61256-kube-api-access-cxslx\") on node \"crc\" DevicePath \"\"" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.566844 4974 generic.go:334] "Generic (PLEG): container finished" podID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerID="290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7" exitCode=0 Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.567002 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vwd2l" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.567031 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwd2l" event={"ID":"734700c6-71f6-41f7-ad56-0a3f8ab61256","Type":"ContainerDied","Data":"290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7"} Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.567994 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vwd2l" event={"ID":"734700c6-71f6-41f7-ad56-0a3f8ab61256","Type":"ContainerDied","Data":"d4aa0dab5e665ed2a6c804807c0b6a5873859c6893e117b230aff89d62d34202"} Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.568038 4974 scope.go:117] "RemoveContainer" containerID="290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.606113 4974 scope.go:117] "RemoveContainer" containerID="5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.619906 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwd2l"] Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.633046 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vwd2l"] Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.647866 4974 scope.go:117] "RemoveContainer" containerID="61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.715407 4974 scope.go:117] "RemoveContainer" containerID="290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7" Oct 13 19:48:20 crc kubenswrapper[4974]: E1013 19:48:20.715948 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7\": container with ID starting with 290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7 not found: ID does not exist" containerID="290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.715992 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7"} err="failed to get container status \"290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7\": rpc error: code = NotFound desc = could not find container \"290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7\": container with ID starting with 290440f12d5b6a28aeddc6a228e8b0b79591045f804c1abaff45c32b2f5d87f7 not found: ID does not exist" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.716016 4974 scope.go:117] "RemoveContainer" containerID="5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39" Oct 13 19:48:20 crc kubenswrapper[4974]: E1013 19:48:20.716412 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39\": container with ID starting with 5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39 not found: ID does not exist" containerID="5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.716469 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39"} err="failed to get container status \"5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39\": rpc error: code = NotFound desc = could not find container \"5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39\": container with ID 
starting with 5c1df5744e02b90a4c3d3f326c2d8e9ed7ee5ed13674bb68ef983c4f9c79bc39 not found: ID does not exist" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.716501 4974 scope.go:117] "RemoveContainer" containerID="61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16" Oct 13 19:48:20 crc kubenswrapper[4974]: E1013 19:48:20.716864 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16\": container with ID starting with 61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16 not found: ID does not exist" containerID="61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16" Oct 13 19:48:20 crc kubenswrapper[4974]: I1013 19:48:20.716909 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16"} err="failed to get container status \"61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16\": rpc error: code = NotFound desc = could not find container \"61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16\": container with ID starting with 61de0f9a50dcb7015d1642db81ba160eedb86ee52f2bd816bd87ab4acfb9cb16 not found: ID does not exist" Oct 13 19:48:21 crc kubenswrapper[4974]: I1013 19:48:21.827496 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="734700c6-71f6-41f7-ad56-0a3f8ab61256" path="/var/lib/kubelet/pods/734700c6-71f6-41f7-ad56-0a3f8ab61256/volumes" Oct 13 19:49:11 crc kubenswrapper[4974]: I1013 19:49:11.235168 4974 generic.go:334] "Generic (PLEG): container finished" podID="98e331dd-24d4-4707-b432-557ea90e6048" containerID="91c6df01012750b43cc81cf7c58c2e656d872828100e2626d6b82a19b1144619" exitCode=0 Oct 13 19:49:11 crc kubenswrapper[4974]: I1013 19:49:11.235245 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/tempest-tests-tempest" event={"ID":"98e331dd-24d4-4707-b432-557ea90e6048","Type":"ContainerDied","Data":"91c6df01012750b43cc81cf7c58c2e656d872828100e2626d6b82a19b1144619"} Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.761448 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.927506 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ssh-key\") pod \"98e331dd-24d4-4707-b432-557ea90e6048\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.927626 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config\") pod \"98e331dd-24d4-4707-b432-557ea90e6048\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.927813 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-config-data\") pod \"98e331dd-24d4-4707-b432-557ea90e6048\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.927913 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-temporary\") pod \"98e331dd-24d4-4707-b432-557ea90e6048\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.928130 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"98e331dd-24d4-4707-b432-557ea90e6048\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.928183 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qtmw\" (UniqueName: \"kubernetes.io/projected/98e331dd-24d4-4707-b432-557ea90e6048-kube-api-access-2qtmw\") pod \"98e331dd-24d4-4707-b432-557ea90e6048\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.928271 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ca-certs\") pod \"98e331dd-24d4-4707-b432-557ea90e6048\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.928344 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config-secret\") pod \"98e331dd-24d4-4707-b432-557ea90e6048\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.928416 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-workdir\") pod \"98e331dd-24d4-4707-b432-557ea90e6048\" (UID: \"98e331dd-24d4-4707-b432-557ea90e6048\") " Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.928345 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "98e331dd-24d4-4707-b432-557ea90e6048" (UID: "98e331dd-24d4-4707-b432-557ea90e6048"). 
InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.928535 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-config-data" (OuterVolumeSpecName: "config-data") pod "98e331dd-24d4-4707-b432-557ea90e6048" (UID: "98e331dd-24d4-4707-b432-557ea90e6048"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.935528 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98e331dd-24d4-4707-b432-557ea90e6048-kube-api-access-2qtmw" (OuterVolumeSpecName: "kube-api-access-2qtmw") pod "98e331dd-24d4-4707-b432-557ea90e6048" (UID: "98e331dd-24d4-4707-b432-557ea90e6048"). InnerVolumeSpecName "kube-api-access-2qtmw". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.936869 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "98e331dd-24d4-4707-b432-557ea90e6048" (UID: "98e331dd-24d4-4707-b432-557ea90e6048"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.937035 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "test-operator-logs") pod "98e331dd-24d4-4707-b432-557ea90e6048" (UID: "98e331dd-24d4-4707-b432-557ea90e6048"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.970955 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "98e331dd-24d4-4707-b432-557ea90e6048" (UID: "98e331dd-24d4-4707-b432-557ea90e6048"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.977234 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "98e331dd-24d4-4707-b432-557ea90e6048" (UID: "98e331dd-24d4-4707-b432-557ea90e6048"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:49:12 crc kubenswrapper[4974]: I1013 19:49:12.988086 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "98e331dd-24d4-4707-b432-557ea90e6048" (UID: "98e331dd-24d4-4707-b432-557ea90e6048"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.031767 4974 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ca-certs\") on node \"crc\" DevicePath \"\"" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.031815 4974 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.031858 4974 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.031871 4974 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98e331dd-24d4-4707-b432-557ea90e6048-ssh-key\") on node \"crc\" DevicePath \"\"" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.031884 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.031895 4974 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/98e331dd-24d4-4707-b432-557ea90e6048-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.031969 4974 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Oct 13 19:49:13 crc 
kubenswrapper[4974]: I1013 19:49:13.032026 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qtmw\" (UniqueName: \"kubernetes.io/projected/98e331dd-24d4-4707-b432-557ea90e6048-kube-api-access-2qtmw\") on node \"crc\" DevicePath \"\"" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.036663 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "98e331dd-24d4-4707-b432-557ea90e6048" (UID: "98e331dd-24d4-4707-b432-557ea90e6048"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.080620 4974 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.133296 4974 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.133452 4974 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/98e331dd-24d4-4707-b432-557ea90e6048-openstack-config\") on node \"crc\" DevicePath \"\"" Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.261149 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"98e331dd-24d4-4707-b432-557ea90e6048","Type":"ContainerDied","Data":"7c76a62293e468edfb8209a299d314723ff77a80209521dc39d72dee32c784df"} Oct 13 19:49:13 crc kubenswrapper[4974]: I1013 19:49:13.261201 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c76a62293e468edfb8209a299d314723ff77a80209521dc39d72dee32c784df" Oct 13 19:49:13 crc 
kubenswrapper[4974]: I1013 19:49:13.261929 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.176890 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 13 19:49:21 crc kubenswrapper[4974]: E1013 19:49:21.178435 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerName="registry-server" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.178463 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerName="registry-server" Oct 13 19:49:21 crc kubenswrapper[4974]: E1013 19:49:21.178488 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerName="extract-content" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.178501 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerName="extract-content" Oct 13 19:49:21 crc kubenswrapper[4974]: E1013 19:49:21.178543 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98e331dd-24d4-4707-b432-557ea90e6048" containerName="tempest-tests-tempest-tests-runner" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.178557 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="98e331dd-24d4-4707-b432-557ea90e6048" containerName="tempest-tests-tempest-tests-runner" Oct 13 19:49:21 crc kubenswrapper[4974]: E1013 19:49:21.178576 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerName="extract-utilities" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.178588 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerName="extract-utilities" Oct 13 19:49:21 crc 
kubenswrapper[4974]: I1013 19:49:21.179018 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="98e331dd-24d4-4707-b432-557ea90e6048" containerName="tempest-tests-tempest-tests-runner" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.179053 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="734700c6-71f6-41f7-ad56-0a3f8ab61256" containerName="registry-server" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.180355 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.183280 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-fxpsb" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.189364 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.324239 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6f8e8b4-a194-4de6-b1c0-9b5b183136c5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.324377 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnrnl\" (UniqueName: \"kubernetes.io/projected/e6f8e8b4-a194-4de6-b1c0-9b5b183136c5-kube-api-access-lnrnl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6f8e8b4-a194-4de6-b1c0-9b5b183136c5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.426208 4974 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6f8e8b4-a194-4de6-b1c0-9b5b183136c5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.426306 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnrnl\" (UniqueName: \"kubernetes.io/projected/e6f8e8b4-a194-4de6-b1c0-9b5b183136c5-kube-api-access-lnrnl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6f8e8b4-a194-4de6-b1c0-9b5b183136c5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.427020 4974 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6f8e8b4-a194-4de6-b1c0-9b5b183136c5\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.463028 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnrnl\" (UniqueName: \"kubernetes.io/projected/e6f8e8b4-a194-4de6-b1c0-9b5b183136c5-kube-api-access-lnrnl\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6f8e8b4-a194-4de6-b1c0-9b5b183136c5\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.464241 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e6f8e8b4-a194-4de6-b1c0-9b5b183136c5\") " 
pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.538672 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Oct 13 19:49:21 crc kubenswrapper[4974]: I1013 19:49:21.881246 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Oct 13 19:49:22 crc kubenswrapper[4974]: I1013 19:49:22.365102 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e6f8e8b4-a194-4de6-b1c0-9b5b183136c5","Type":"ContainerStarted","Data":"3f249076861a6e95411967c4a5637c3d29a6393fb9026707253093734be29283"} Oct 13 19:49:23 crc kubenswrapper[4974]: I1013 19:49:23.386547 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e6f8e8b4-a194-4de6-b1c0-9b5b183136c5","Type":"ContainerStarted","Data":"b465677793569203287096105763f37f21c6eada14a40efb131af5d074505e5a"} Oct 13 19:49:23 crc kubenswrapper[4974]: I1013 19:49:23.404461 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.539426466 podStartE2EDuration="2.404443606s" podCreationTimestamp="2025-10-13 19:49:21 +0000 UTC" firstStartedPulling="2025-10-13 19:49:21.89161447 +0000 UTC m=+5696.795980550" lastFinishedPulling="2025-10-13 19:49:22.7566316 +0000 UTC m=+5697.660997690" observedRunningTime="2025-10-13 19:49:23.401913895 +0000 UTC m=+5698.306279975" watchObservedRunningTime="2025-10-13 19:49:23.404443606 +0000 UTC m=+5698.308809686" Oct 13 19:49:41 crc kubenswrapper[4974]: I1013 19:49:41.927955 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6cfrc/must-gather-hn667"] Oct 13 19:49:41 crc kubenswrapper[4974]: 
I1013 19:49:41.930070 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cfrc/must-gather-hn667" Oct 13 19:49:41 crc kubenswrapper[4974]: I1013 19:49:41.933758 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6cfrc"/"openshift-service-ca.crt" Oct 13 19:49:41 crc kubenswrapper[4974]: I1013 19:49:41.934055 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6cfrc"/"kube-root-ca.crt" Oct 13 19:49:41 crc kubenswrapper[4974]: I1013 19:49:41.934330 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6cfrc"/"default-dockercfg-dqkdg" Oct 13 19:49:41 crc kubenswrapper[4974]: I1013 19:49:41.940278 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6cfrc/must-gather-hn667"] Oct 13 19:49:41 crc kubenswrapper[4974]: I1013 19:49:41.998864 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/253acb3b-63bf-4e3b-841a-44a99311d3b3-must-gather-output\") pod \"must-gather-hn667\" (UID: \"253acb3b-63bf-4e3b-841a-44a99311d3b3\") " pod="openshift-must-gather-6cfrc/must-gather-hn667" Oct 13 19:49:41 crc kubenswrapper[4974]: I1013 19:49:41.999506 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wf2p4\" (UniqueName: \"kubernetes.io/projected/253acb3b-63bf-4e3b-841a-44a99311d3b3-kube-api-access-wf2p4\") pod \"must-gather-hn667\" (UID: \"253acb3b-63bf-4e3b-841a-44a99311d3b3\") " pod="openshift-must-gather-6cfrc/must-gather-hn667" Oct 13 19:49:42 crc kubenswrapper[4974]: I1013 19:49:42.101816 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/253acb3b-63bf-4e3b-841a-44a99311d3b3-must-gather-output\") pod \"must-gather-hn667\" 
(UID: \"253acb3b-63bf-4e3b-841a-44a99311d3b3\") " pod="openshift-must-gather-6cfrc/must-gather-hn667" Oct 13 19:49:42 crc kubenswrapper[4974]: I1013 19:49:42.101937 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wf2p4\" (UniqueName: \"kubernetes.io/projected/253acb3b-63bf-4e3b-841a-44a99311d3b3-kube-api-access-wf2p4\") pod \"must-gather-hn667\" (UID: \"253acb3b-63bf-4e3b-841a-44a99311d3b3\") " pod="openshift-must-gather-6cfrc/must-gather-hn667" Oct 13 19:49:42 crc kubenswrapper[4974]: I1013 19:49:42.102479 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/253acb3b-63bf-4e3b-841a-44a99311d3b3-must-gather-output\") pod \"must-gather-hn667\" (UID: \"253acb3b-63bf-4e3b-841a-44a99311d3b3\") " pod="openshift-must-gather-6cfrc/must-gather-hn667" Oct 13 19:49:42 crc kubenswrapper[4974]: I1013 19:49:42.126210 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wf2p4\" (UniqueName: \"kubernetes.io/projected/253acb3b-63bf-4e3b-841a-44a99311d3b3-kube-api-access-wf2p4\") pod \"must-gather-hn667\" (UID: \"253acb3b-63bf-4e3b-841a-44a99311d3b3\") " pod="openshift-must-gather-6cfrc/must-gather-hn667" Oct 13 19:49:42 crc kubenswrapper[4974]: I1013 19:49:42.259897 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cfrc/must-gather-hn667" Oct 13 19:49:42 crc kubenswrapper[4974]: I1013 19:49:42.851766 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6cfrc/must-gather-hn667"] Oct 13 19:49:43 crc kubenswrapper[4974]: I1013 19:49:43.674880 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/must-gather-hn667" event={"ID":"253acb3b-63bf-4e3b-841a-44a99311d3b3","Type":"ContainerStarted","Data":"84c4ba007858078a6ecbc9ba4109c27ce2f3fd3cbc246550d562c72a52046574"} Oct 13 19:49:49 crc kubenswrapper[4974]: I1013 19:49:49.742131 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/must-gather-hn667" event={"ID":"253acb3b-63bf-4e3b-841a-44a99311d3b3","Type":"ContainerStarted","Data":"a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667"} Oct 13 19:49:49 crc kubenswrapper[4974]: I1013 19:49:49.742711 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/must-gather-hn667" event={"ID":"253acb3b-63bf-4e3b-841a-44a99311d3b3","Type":"ContainerStarted","Data":"45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055"} Oct 13 19:49:49 crc kubenswrapper[4974]: I1013 19:49:49.756707 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6cfrc/must-gather-hn667" podStartSLOduration=2.6361068149999998 podStartE2EDuration="8.756688121s" podCreationTimestamp="2025-10-13 19:49:41 +0000 UTC" firstStartedPulling="2025-10-13 19:49:42.831075953 +0000 UTC m=+5717.735442073" lastFinishedPulling="2025-10-13 19:49:48.951657309 +0000 UTC m=+5723.856023379" observedRunningTime="2025-10-13 19:49:49.754962303 +0000 UTC m=+5724.659328383" watchObservedRunningTime="2025-10-13 19:49:49.756688121 +0000 UTC m=+5724.661054211" Oct 13 19:49:52 crc kubenswrapper[4974]: E1013 19:49:52.037127 4974 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 
38.102.83.30:55560->38.102.83.30:44205: write tcp 38.102.83.30:55560->38.102.83.30:44205: write: broken pipe Oct 13 19:49:53 crc kubenswrapper[4974]: I1013 19:49:53.314302 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6cfrc/crc-debug-94nlt"] Oct 13 19:49:53 crc kubenswrapper[4974]: I1013 19:49:53.316221 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-94nlt" Oct 13 19:49:53 crc kubenswrapper[4974]: I1013 19:49:53.460053 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02bac7df-77e3-4326-8258-3f7458c54d5e-host\") pod \"crc-debug-94nlt\" (UID: \"02bac7df-77e3-4326-8258-3f7458c54d5e\") " pod="openshift-must-gather-6cfrc/crc-debug-94nlt" Oct 13 19:49:53 crc kubenswrapper[4974]: I1013 19:49:53.460456 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c9pb\" (UniqueName: \"kubernetes.io/projected/02bac7df-77e3-4326-8258-3f7458c54d5e-kube-api-access-9c9pb\") pod \"crc-debug-94nlt\" (UID: \"02bac7df-77e3-4326-8258-3f7458c54d5e\") " pod="openshift-must-gather-6cfrc/crc-debug-94nlt" Oct 13 19:49:53 crc kubenswrapper[4974]: I1013 19:49:53.562223 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c9pb\" (UniqueName: \"kubernetes.io/projected/02bac7df-77e3-4326-8258-3f7458c54d5e-kube-api-access-9c9pb\") pod \"crc-debug-94nlt\" (UID: \"02bac7df-77e3-4326-8258-3f7458c54d5e\") " pod="openshift-must-gather-6cfrc/crc-debug-94nlt" Oct 13 19:49:53 crc kubenswrapper[4974]: I1013 19:49:53.562330 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02bac7df-77e3-4326-8258-3f7458c54d5e-host\") pod \"crc-debug-94nlt\" (UID: \"02bac7df-77e3-4326-8258-3f7458c54d5e\") " 
pod="openshift-must-gather-6cfrc/crc-debug-94nlt" Oct 13 19:49:53 crc kubenswrapper[4974]: I1013 19:49:53.562498 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02bac7df-77e3-4326-8258-3f7458c54d5e-host\") pod \"crc-debug-94nlt\" (UID: \"02bac7df-77e3-4326-8258-3f7458c54d5e\") " pod="openshift-must-gather-6cfrc/crc-debug-94nlt" Oct 13 19:49:53 crc kubenswrapper[4974]: I1013 19:49:53.587296 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c9pb\" (UniqueName: \"kubernetes.io/projected/02bac7df-77e3-4326-8258-3f7458c54d5e-kube-api-access-9c9pb\") pod \"crc-debug-94nlt\" (UID: \"02bac7df-77e3-4326-8258-3f7458c54d5e\") " pod="openshift-must-gather-6cfrc/crc-debug-94nlt" Oct 13 19:49:53 crc kubenswrapper[4974]: I1013 19:49:53.635447 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-94nlt" Oct 13 19:49:53 crc kubenswrapper[4974]: W1013 19:49:53.674693 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02bac7df_77e3_4326_8258_3f7458c54d5e.slice/crio-22873699ca4848209385da05f0a89f6e055ac7a838a53f3b1c20b70f64ace3a4 WatchSource:0}: Error finding container 22873699ca4848209385da05f0a89f6e055ac7a838a53f3b1c20b70f64ace3a4: Status 404 returned error can't find the container with id 22873699ca4848209385da05f0a89f6e055ac7a838a53f3b1c20b70f64ace3a4 Oct 13 19:49:53 crc kubenswrapper[4974]: I1013 19:49:53.784625 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/crc-debug-94nlt" event={"ID":"02bac7df-77e3-4326-8258-3f7458c54d5e","Type":"ContainerStarted","Data":"22873699ca4848209385da05f0a89f6e055ac7a838a53f3b1c20b70f64ace3a4"} Oct 13 19:50:03 crc kubenswrapper[4974]: I1013 19:50:03.891728 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-6cfrc/crc-debug-94nlt" event={"ID":"02bac7df-77e3-4326-8258-3f7458c54d5e","Type":"ContainerStarted","Data":"bde186223c0a2606a50e3054d5b68606c8512b44d5ec593648d89660a85c1546"} Oct 13 19:50:03 crc kubenswrapper[4974]: I1013 19:50:03.907053 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6cfrc/crc-debug-94nlt" podStartSLOduration=1.257038438 podStartE2EDuration="10.907035728s" podCreationTimestamp="2025-10-13 19:49:53 +0000 UTC" firstStartedPulling="2025-10-13 19:49:53.676822495 +0000 UTC m=+5728.581188575" lastFinishedPulling="2025-10-13 19:50:03.326819785 +0000 UTC m=+5738.231185865" observedRunningTime="2025-10-13 19:50:03.903116208 +0000 UTC m=+5738.807482288" watchObservedRunningTime="2025-10-13 19:50:03.907035728 +0000 UTC m=+5738.811401808" Oct 13 19:50:07 crc kubenswrapper[4974]: I1013 19:50:07.743097 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:50:07 crc kubenswrapper[4974]: I1013 19:50:07.743450 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:50:37 crc kubenswrapper[4974]: I1013 19:50:37.743439 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:50:37 crc kubenswrapper[4974]: I1013 19:50:37.746791 
4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:50:48 crc kubenswrapper[4974]: I1013 19:50:48.332846 4974 generic.go:334] "Generic (PLEG): container finished" podID="02bac7df-77e3-4326-8258-3f7458c54d5e" containerID="bde186223c0a2606a50e3054d5b68606c8512b44d5ec593648d89660a85c1546" exitCode=0 Oct 13 19:50:48 crc kubenswrapper[4974]: I1013 19:50:48.332981 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/crc-debug-94nlt" event={"ID":"02bac7df-77e3-4326-8258-3f7458c54d5e","Type":"ContainerDied","Data":"bde186223c0a2606a50e3054d5b68606c8512b44d5ec593648d89660a85c1546"} Oct 13 19:50:49 crc kubenswrapper[4974]: I1013 19:50:49.472307 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-94nlt" Oct 13 19:50:49 crc kubenswrapper[4974]: I1013 19:50:49.522297 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6cfrc/crc-debug-94nlt"] Oct 13 19:50:49 crc kubenswrapper[4974]: I1013 19:50:49.533415 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6cfrc/crc-debug-94nlt"] Oct 13 19:50:49 crc kubenswrapper[4974]: I1013 19:50:49.579233 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9c9pb\" (UniqueName: \"kubernetes.io/projected/02bac7df-77e3-4326-8258-3f7458c54d5e-kube-api-access-9c9pb\") pod \"02bac7df-77e3-4326-8258-3f7458c54d5e\" (UID: \"02bac7df-77e3-4326-8258-3f7458c54d5e\") " Oct 13 19:50:49 crc kubenswrapper[4974]: I1013 19:50:49.579322 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02bac7df-77e3-4326-8258-3f7458c54d5e-host\") pod \"02bac7df-77e3-4326-8258-3f7458c54d5e\" (UID: \"02bac7df-77e3-4326-8258-3f7458c54d5e\") " Oct 13 19:50:49 crc kubenswrapper[4974]: I1013 19:50:49.579484 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/02bac7df-77e3-4326-8258-3f7458c54d5e-host" (OuterVolumeSpecName: "host") pod "02bac7df-77e3-4326-8258-3f7458c54d5e" (UID: "02bac7df-77e3-4326-8258-3f7458c54d5e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 19:50:49 crc kubenswrapper[4974]: I1013 19:50:49.579998 4974 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/02bac7df-77e3-4326-8258-3f7458c54d5e-host\") on node \"crc\" DevicePath \"\"" Oct 13 19:50:49 crc kubenswrapper[4974]: I1013 19:50:49.597621 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02bac7df-77e3-4326-8258-3f7458c54d5e-kube-api-access-9c9pb" (OuterVolumeSpecName: "kube-api-access-9c9pb") pod "02bac7df-77e3-4326-8258-3f7458c54d5e" (UID: "02bac7df-77e3-4326-8258-3f7458c54d5e"). InnerVolumeSpecName "kube-api-access-9c9pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:50:49 crc kubenswrapper[4974]: I1013 19:50:49.682193 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9c9pb\" (UniqueName: \"kubernetes.io/projected/02bac7df-77e3-4326-8258-3f7458c54d5e-kube-api-access-9c9pb\") on node \"crc\" DevicePath \"\"" Oct 13 19:50:49 crc kubenswrapper[4974]: I1013 19:50:49.825957 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02bac7df-77e3-4326-8258-3f7458c54d5e" path="/var/lib/kubelet/pods/02bac7df-77e3-4326-8258-3f7458c54d5e/volumes" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.356327 4974 scope.go:117] "RemoveContainer" containerID="bde186223c0a2606a50e3054d5b68606c8512b44d5ec593648d89660a85c1546" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.356381 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-94nlt" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.721005 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6cfrc/crc-debug-jzpq7"] Oct 13 19:50:50 crc kubenswrapper[4974]: E1013 19:50:50.721881 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02bac7df-77e3-4326-8258-3f7458c54d5e" containerName="container-00" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.721899 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="02bac7df-77e3-4326-8258-3f7458c54d5e" containerName="container-00" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.722094 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="02bac7df-77e3-4326-8258-3f7458c54d5e" containerName="container-00" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.722862 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.804466 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe2eef48-9df3-43e8-864f-1669b8c6e432-host\") pod \"crc-debug-jzpq7\" (UID: \"fe2eef48-9df3-43e8-864f-1669b8c6e432\") " pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.804811 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-567wd\" (UniqueName: \"kubernetes.io/projected/fe2eef48-9df3-43e8-864f-1669b8c6e432-kube-api-access-567wd\") pod \"crc-debug-jzpq7\" (UID: \"fe2eef48-9df3-43e8-864f-1669b8c6e432\") " pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.906585 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-567wd\" (UniqueName: 
\"kubernetes.io/projected/fe2eef48-9df3-43e8-864f-1669b8c6e432-kube-api-access-567wd\") pod \"crc-debug-jzpq7\" (UID: \"fe2eef48-9df3-43e8-864f-1669b8c6e432\") " pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.906758 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe2eef48-9df3-43e8-864f-1669b8c6e432-host\") pod \"crc-debug-jzpq7\" (UID: \"fe2eef48-9df3-43e8-864f-1669b8c6e432\") " pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.906922 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe2eef48-9df3-43e8-864f-1669b8c6e432-host\") pod \"crc-debug-jzpq7\" (UID: \"fe2eef48-9df3-43e8-864f-1669b8c6e432\") " pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" Oct 13 19:50:50 crc kubenswrapper[4974]: I1013 19:50:50.924289 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-567wd\" (UniqueName: \"kubernetes.io/projected/fe2eef48-9df3-43e8-864f-1669b8c6e432-kube-api-access-567wd\") pod \"crc-debug-jzpq7\" (UID: \"fe2eef48-9df3-43e8-864f-1669b8c6e432\") " pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" Oct 13 19:50:51 crc kubenswrapper[4974]: I1013 19:50:51.049149 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" Oct 13 19:50:51 crc kubenswrapper[4974]: I1013 19:50:51.366235 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" event={"ID":"fe2eef48-9df3-43e8-864f-1669b8c6e432","Type":"ContainerStarted","Data":"e84c2effc5a8f5894c9c3de539b023f0ff31d420d38cb958028fbacc21add6ec"} Oct 13 19:50:51 crc kubenswrapper[4974]: I1013 19:50:51.366852 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" event={"ID":"fe2eef48-9df3-43e8-864f-1669b8c6e432","Type":"ContainerStarted","Data":"e3855703caeb34bac8276922d2226e6d9a9b9cc2c27d805d3794c86997f8d040"} Oct 13 19:50:51 crc kubenswrapper[4974]: I1013 19:50:51.390457 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" podStartSLOduration=1.390439503 podStartE2EDuration="1.390439503s" podCreationTimestamp="2025-10-13 19:50:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 19:50:51.379854955 +0000 UTC m=+5786.284221035" watchObservedRunningTime="2025-10-13 19:50:51.390439503 +0000 UTC m=+5786.294805593" Oct 13 19:50:52 crc kubenswrapper[4974]: I1013 19:50:52.379527 4974 generic.go:334] "Generic (PLEG): container finished" podID="fe2eef48-9df3-43e8-864f-1669b8c6e432" containerID="e84c2effc5a8f5894c9c3de539b023f0ff31d420d38cb958028fbacc21add6ec" exitCode=0 Oct 13 19:50:52 crc kubenswrapper[4974]: I1013 19:50:52.379608 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" event={"ID":"fe2eef48-9df3-43e8-864f-1669b8c6e432","Type":"ContainerDied","Data":"e84c2effc5a8f5894c9c3de539b023f0ff31d420d38cb958028fbacc21add6ec"} Oct 13 19:50:53 crc kubenswrapper[4974]: I1013 19:50:53.502640 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" Oct 13 19:50:53 crc kubenswrapper[4974]: I1013 19:50:53.554399 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6cfrc/crc-debug-jzpq7"] Oct 13 19:50:53 crc kubenswrapper[4974]: I1013 19:50:53.559009 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe2eef48-9df3-43e8-864f-1669b8c6e432-host\") pod \"fe2eef48-9df3-43e8-864f-1669b8c6e432\" (UID: \"fe2eef48-9df3-43e8-864f-1669b8c6e432\") " Oct 13 19:50:53 crc kubenswrapper[4974]: I1013 19:50:53.559077 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-567wd\" (UniqueName: \"kubernetes.io/projected/fe2eef48-9df3-43e8-864f-1669b8c6e432-kube-api-access-567wd\") pod \"fe2eef48-9df3-43e8-864f-1669b8c6e432\" (UID: \"fe2eef48-9df3-43e8-864f-1669b8c6e432\") " Oct 13 19:50:53 crc kubenswrapper[4974]: I1013 19:50:53.561268 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe2eef48-9df3-43e8-864f-1669b8c6e432-host" (OuterVolumeSpecName: "host") pod "fe2eef48-9df3-43e8-864f-1669b8c6e432" (UID: "fe2eef48-9df3-43e8-864f-1669b8c6e432"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 19:50:53 crc kubenswrapper[4974]: I1013 19:50:53.569774 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6cfrc/crc-debug-jzpq7"] Oct 13 19:50:53 crc kubenswrapper[4974]: I1013 19:50:53.587001 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe2eef48-9df3-43e8-864f-1669b8c6e432-kube-api-access-567wd" (OuterVolumeSpecName: "kube-api-access-567wd") pod "fe2eef48-9df3-43e8-864f-1669b8c6e432" (UID: "fe2eef48-9df3-43e8-864f-1669b8c6e432"). InnerVolumeSpecName "kube-api-access-567wd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:50:53 crc kubenswrapper[4974]: I1013 19:50:53.662261 4974 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fe2eef48-9df3-43e8-864f-1669b8c6e432-host\") on node \"crc\" DevicePath \"\"" Oct 13 19:50:53 crc kubenswrapper[4974]: I1013 19:50:53.662316 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-567wd\" (UniqueName: \"kubernetes.io/projected/fe2eef48-9df3-43e8-864f-1669b8c6e432-kube-api-access-567wd\") on node \"crc\" DevicePath \"\"" Oct 13 19:50:53 crc kubenswrapper[4974]: I1013 19:50:53.825137 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe2eef48-9df3-43e8-864f-1669b8c6e432" path="/var/lib/kubelet/pods/fe2eef48-9df3-43e8-864f-1669b8c6e432/volumes" Oct 13 19:50:54 crc kubenswrapper[4974]: I1013 19:50:54.409455 4974 scope.go:117] "RemoveContainer" containerID="e84c2effc5a8f5894c9c3de539b023f0ff31d420d38cb958028fbacc21add6ec" Oct 13 19:50:54 crc kubenswrapper[4974]: I1013 19:50:54.409549 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-jzpq7" Oct 13 19:50:54 crc kubenswrapper[4974]: I1013 19:50:54.858761 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6cfrc/crc-debug-ck4wp"] Oct 13 19:50:54 crc kubenswrapper[4974]: E1013 19:50:54.859627 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe2eef48-9df3-43e8-864f-1669b8c6e432" containerName="container-00" Oct 13 19:50:54 crc kubenswrapper[4974]: I1013 19:50:54.859644 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe2eef48-9df3-43e8-864f-1669b8c6e432" containerName="container-00" Oct 13 19:50:54 crc kubenswrapper[4974]: I1013 19:50:54.859948 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe2eef48-9df3-43e8-864f-1669b8c6e432" containerName="container-00" Oct 13 19:50:54 crc kubenswrapper[4974]: I1013 19:50:54.860769 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" Oct 13 19:50:54 crc kubenswrapper[4974]: I1013 19:50:54.988883 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzscs\" (UniqueName: \"kubernetes.io/projected/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-kube-api-access-rzscs\") pod \"crc-debug-ck4wp\" (UID: \"b6f9a96f-ee13-4237-9e37-e65eb8320ec4\") " pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" Oct 13 19:50:54 crc kubenswrapper[4974]: I1013 19:50:54.989001 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-host\") pod \"crc-debug-ck4wp\" (UID: \"b6f9a96f-ee13-4237-9e37-e65eb8320ec4\") " pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" Oct 13 19:50:55 crc kubenswrapper[4974]: I1013 19:50:55.090567 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-host\") pod \"crc-debug-ck4wp\" (UID: \"b6f9a96f-ee13-4237-9e37-e65eb8320ec4\") " pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" Oct 13 19:50:55 crc kubenswrapper[4974]: I1013 19:50:55.090733 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-host\") pod \"crc-debug-ck4wp\" (UID: \"b6f9a96f-ee13-4237-9e37-e65eb8320ec4\") " pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" Oct 13 19:50:55 crc kubenswrapper[4974]: I1013 19:50:55.091038 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzscs\" (UniqueName: \"kubernetes.io/projected/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-kube-api-access-rzscs\") pod \"crc-debug-ck4wp\" (UID: \"b6f9a96f-ee13-4237-9e37-e65eb8320ec4\") " pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" Oct 13 19:50:55 crc kubenswrapper[4974]: I1013 19:50:55.110491 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzscs\" (UniqueName: \"kubernetes.io/projected/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-kube-api-access-rzscs\") pod \"crc-debug-ck4wp\" (UID: \"b6f9a96f-ee13-4237-9e37-e65eb8320ec4\") " pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" Oct 13 19:50:55 crc kubenswrapper[4974]: I1013 19:50:55.177997 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" Oct 13 19:50:55 crc kubenswrapper[4974]: I1013 19:50:55.420383 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" event={"ID":"b6f9a96f-ee13-4237-9e37-e65eb8320ec4","Type":"ContainerStarted","Data":"8c910f79dda938f66a3d70a06413129629a1e941b275a0670a0bf25fbbeda2c7"} Oct 13 19:50:56 crc kubenswrapper[4974]: I1013 19:50:56.435834 4974 generic.go:334] "Generic (PLEG): container finished" podID="b6f9a96f-ee13-4237-9e37-e65eb8320ec4" containerID="4998d1a92e409469112a130277fa9c23478944c321eda499d11884f838c4c604" exitCode=0 Oct 13 19:50:56 crc kubenswrapper[4974]: I1013 19:50:56.435894 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" event={"ID":"b6f9a96f-ee13-4237-9e37-e65eb8320ec4","Type":"ContainerDied","Data":"4998d1a92e409469112a130277fa9c23478944c321eda499d11884f838c4c604"} Oct 13 19:50:56 crc kubenswrapper[4974]: I1013 19:50:56.501060 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6cfrc/crc-debug-ck4wp"] Oct 13 19:50:56 crc kubenswrapper[4974]: I1013 19:50:56.514581 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6cfrc/crc-debug-ck4wp"] Oct 13 19:50:57 crc kubenswrapper[4974]: I1013 19:50:57.554017 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" Oct 13 19:50:57 crc kubenswrapper[4974]: I1013 19:50:57.647448 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzscs\" (UniqueName: \"kubernetes.io/projected/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-kube-api-access-rzscs\") pod \"b6f9a96f-ee13-4237-9e37-e65eb8320ec4\" (UID: \"b6f9a96f-ee13-4237-9e37-e65eb8320ec4\") " Oct 13 19:50:57 crc kubenswrapper[4974]: I1013 19:50:57.647503 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-host\") pod \"b6f9a96f-ee13-4237-9e37-e65eb8320ec4\" (UID: \"b6f9a96f-ee13-4237-9e37-e65eb8320ec4\") " Oct 13 19:50:57 crc kubenswrapper[4974]: I1013 19:50:57.647602 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-host" (OuterVolumeSpecName: "host") pod "b6f9a96f-ee13-4237-9e37-e65eb8320ec4" (UID: "b6f9a96f-ee13-4237-9e37-e65eb8320ec4"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 19:50:57 crc kubenswrapper[4974]: I1013 19:50:57.648201 4974 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-host\") on node \"crc\" DevicePath \"\"" Oct 13 19:50:57 crc kubenswrapper[4974]: I1013 19:50:57.653211 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-kube-api-access-rzscs" (OuterVolumeSpecName: "kube-api-access-rzscs") pod "b6f9a96f-ee13-4237-9e37-e65eb8320ec4" (UID: "b6f9a96f-ee13-4237-9e37-e65eb8320ec4"). InnerVolumeSpecName "kube-api-access-rzscs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:50:57 crc kubenswrapper[4974]: I1013 19:50:57.750746 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzscs\" (UniqueName: \"kubernetes.io/projected/b6f9a96f-ee13-4237-9e37-e65eb8320ec4-kube-api-access-rzscs\") on node \"crc\" DevicePath \"\"" Oct 13 19:50:57 crc kubenswrapper[4974]: I1013 19:50:57.822311 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f9a96f-ee13-4237-9e37-e65eb8320ec4" path="/var/lib/kubelet/pods/b6f9a96f-ee13-4237-9e37-e65eb8320ec4/volumes" Oct 13 19:50:58 crc kubenswrapper[4974]: I1013 19:50:58.452888 4974 scope.go:117] "RemoveContainer" containerID="4998d1a92e409469112a130277fa9c23478944c321eda499d11884f838c4c604" Oct 13 19:50:58 crc kubenswrapper[4974]: I1013 19:50:58.452897 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cfrc/crc-debug-ck4wp" Oct 13 19:51:07 crc kubenswrapper[4974]: I1013 19:51:07.743072 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:51:07 crc kubenswrapper[4974]: I1013 19:51:07.743793 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:51:07 crc kubenswrapper[4974]: I1013 19:51:07.743894 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 19:51:07 crc kubenswrapper[4974]: I1013 19:51:07.745293 4974 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aacae48f28a8d16fcda75b3e9bb8a19abe359616918070df5b23b5f7d085a4ab"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 19:51:07 crc kubenswrapper[4974]: I1013 19:51:07.745512 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://aacae48f28a8d16fcda75b3e9bb8a19abe359616918070df5b23b5f7d085a4ab" gracePeriod=600 Oct 13 19:51:08 crc kubenswrapper[4974]: I1013 19:51:08.557069 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="aacae48f28a8d16fcda75b3e9bb8a19abe359616918070df5b23b5f7d085a4ab" exitCode=0 Oct 13 19:51:08 crc kubenswrapper[4974]: I1013 19:51:08.557142 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"aacae48f28a8d16fcda75b3e9bb8a19abe359616918070df5b23b5f7d085a4ab"} Oct 13 19:51:08 crc kubenswrapper[4974]: I1013 19:51:08.557428 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40"} Oct 13 19:51:08 crc kubenswrapper[4974]: I1013 19:51:08.557451 4974 scope.go:117] "RemoveContainer" containerID="4d3d32ca99409d9e3dfcd5cc21bd25af2306124b5772eb260619bedc766bbdbc" Oct 13 19:51:18 crc kubenswrapper[4974]: I1013 19:51:18.660610 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-cf595f5c8-dtfck_b67ca997-2edf-492b-ab80-f618c7201a29/barbican-api/0.log" Oct 13 19:51:18 crc kubenswrapper[4974]: I1013 19:51:18.860031 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-cf595f5c8-dtfck_b67ca997-2edf-492b-ab80-f618c7201a29/barbican-api-log/0.log" Oct 13 19:51:18 crc kubenswrapper[4974]: I1013 19:51:18.911419 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f69856764-9cjzr_45c18dbd-7083-463d-b845-f213bf6ae1ce/barbican-keystone-listener/0.log" Oct 13 19:51:19 crc kubenswrapper[4974]: I1013 19:51:19.005267 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f69856764-9cjzr_45c18dbd-7083-463d-b845-f213bf6ae1ce/barbican-keystone-listener-log/0.log" Oct 13 19:51:19 crc kubenswrapper[4974]: I1013 19:51:19.063480 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5b867446cf-7crxm_5acb3840-d265-46e7-8a2b-630f1bf38ec5/barbican-worker/0.log" Oct 13 19:51:19 crc kubenswrapper[4974]: I1013 19:51:19.143465 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5b867446cf-7crxm_5acb3840-d265-46e7-8a2b-630f1bf38ec5/barbican-worker-log/0.log" Oct 13 19:51:19 crc kubenswrapper[4974]: I1013 19:51:19.285867 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg_684a8cdf-df17-41a7-87b8-9027cb982025/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:19 crc kubenswrapper[4974]: I1013 19:51:19.453797 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cc0eb77b-ce25-42b6-a03f-600b090be522/ceilometer-central-agent/0.log" Oct 13 19:51:19 crc kubenswrapper[4974]: I1013 19:51:19.503546 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_cc0eb77b-ce25-42b6-a03f-600b090be522/proxy-httpd/0.log" Oct 13 19:51:19 crc kubenswrapper[4974]: I1013 19:51:19.504182 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cc0eb77b-ce25-42b6-a03f-600b090be522/ceilometer-notification-agent/0.log" Oct 13 19:51:19 crc kubenswrapper[4974]: I1013 19:51:19.575099 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cc0eb77b-ce25-42b6-a03f-600b090be522/sg-core/0.log" Oct 13 19:51:19 crc kubenswrapper[4974]: I1013 19:51:19.847222 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a4d89238-6f74-4da7-aa6d-1b6c5f56a204/cinder-api-log/0.log" Oct 13 19:51:20 crc kubenswrapper[4974]: I1013 19:51:20.128482 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_57963e15-5fff-4158-ad83-0e4bd2ca1f7f/probe/0.log" Oct 13 19:51:20 crc kubenswrapper[4974]: I1013 19:51:20.428291 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8a1a4b28-15ae-4fa5-8741-2d34b1062eee/cinder-scheduler/0.log" Oct 13 19:51:20 crc kubenswrapper[4974]: I1013 19:51:20.469085 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8a1a4b28-15ae-4fa5-8741-2d34b1062eee/probe/0.log" Oct 13 19:51:20 crc kubenswrapper[4974]: I1013 19:51:20.690251 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a4d89238-6f74-4da7-aa6d-1b6c5f56a204/cinder-api/0.log" Oct 13 19:51:20 crc kubenswrapper[4974]: I1013 19:51:20.757914 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_57963e15-5fff-4158-ad83-0e4bd2ca1f7f/cinder-backup/0.log" Oct 13 19:51:20 crc kubenswrapper[4974]: I1013 19:51:20.861508 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e/probe/0.log" Oct 13 19:51:21 
crc kubenswrapper[4974]: I1013 19:51:21.098112 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_688ac6ef-82eb-421b-9949-a832b8a73319/probe/0.log" Oct 13 19:51:21 crc kubenswrapper[4974]: I1013 19:51:21.158113 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e/cinder-volume/0.log" Oct 13 19:51:21 crc kubenswrapper[4974]: I1013 19:51:21.362137 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9_fedc6dd9-1f4c-43f4-9e0b-74292be529a6/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:21 crc kubenswrapper[4974]: I1013 19:51:21.444700 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_688ac6ef-82eb-421b-9949-a832b8a73319/cinder-volume/0.log" Oct 13 19:51:21 crc kubenswrapper[4974]: I1013 19:51:21.488706 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v_177f015a-482d-4058-a475-e6f787c7c1e5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:21 crc kubenswrapper[4974]: I1013 19:51:21.649953 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh_af61911c-89bc-4e8c-a327-6c1bab3c7d5d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:21 crc kubenswrapper[4974]: I1013 19:51:21.700986 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7d59b7cdcf-mbsgm_83dacb6d-48a4-400a-9edb-74a61b3bf83f/init/0.log" Oct 13 19:51:21 crc kubenswrapper[4974]: I1013 19:51:21.949167 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7d59b7cdcf-mbsgm_83dacb6d-48a4-400a-9edb-74a61b3bf83f/init/0.log" Oct 13 19:51:21 crc kubenswrapper[4974]: I1013 19:51:21.951261 4974 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8lbms_dec36c0a-5335-4f2c-9582-ccd2c8f30207/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:22 crc kubenswrapper[4974]: I1013 19:51:22.087944 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7d59b7cdcf-mbsgm_83dacb6d-48a4-400a-9edb-74a61b3bf83f/dnsmasq-dns/0.log" Oct 13 19:51:22 crc kubenswrapper[4974]: I1013 19:51:22.142837 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c989ea63-17f9-4aca-a407-9e07cbb1a04c/glance-httpd/0.log" Oct 13 19:51:22 crc kubenswrapper[4974]: I1013 19:51:22.179718 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c989ea63-17f9-4aca-a407-9e07cbb1a04c/glance-log/0.log" Oct 13 19:51:22 crc kubenswrapper[4974]: I1013 19:51:22.303260 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7f9b8239-9b8d-4e59-8ba6-b7d8b5959248/glance-httpd/0.log" Oct 13 19:51:22 crc kubenswrapper[4974]: I1013 19:51:22.324169 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7f9b8239-9b8d-4e59-8ba6-b7d8b5959248/glance-log/0.log" Oct 13 19:51:22 crc kubenswrapper[4974]: I1013 19:51:22.473368 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dcbf4cfcd-l89jc_003d2222-76eb-4a8c-b7c2-f201e88c542d/horizon/0.log" Oct 13 19:51:22 crc kubenswrapper[4974]: I1013 19:51:22.595448 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-wpc86_d272e4d0-84bf-4909-af41-81fe1f14bfcb/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:22 crc kubenswrapper[4974]: I1013 19:51:22.785698 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mwhx7_7e2087d1-027f-4fc7-8a75-5421f0e55868/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:23 crc kubenswrapper[4974]: I1013 19:51:23.007862 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29339701-wr52s_d52747d1-422c-40d4-ae78-d45dafcf9cbf/keystone-cron/0.log" Oct 13 19:51:23 crc kubenswrapper[4974]: I1013 19:51:23.165773 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8609b0d5-280b-498a-88da-2de3c7e27605/kube-state-metrics/0.log" Oct 13 19:51:23 crc kubenswrapper[4974]: I1013 19:51:23.390882 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-bpglt_dc0e7077-837e-4e51-a095-60eed2b94a51/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:23 crc kubenswrapper[4974]: I1013 19:51:23.477316 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dcbf4cfcd-l89jc_003d2222-76eb-4a8c-b7c2-f201e88c542d/horizon-log/0.log" Oct 13 19:51:23 crc kubenswrapper[4974]: I1013 19:51:23.586588 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6f98cf4cc8-pgzsc_63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52/keystone-api/0.log" Oct 13 19:51:23 crc kubenswrapper[4974]: I1013 19:51:23.854323 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72_4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:23 crc kubenswrapper[4974]: I1013 19:51:23.941990 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8475fc656f-dnpll_59a33676-139f-4010-ab9a-25832163ab83/neutron-httpd/0.log" Oct 13 19:51:23 crc kubenswrapper[4974]: I1013 19:51:23.948129 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-8475fc656f-dnpll_59a33676-139f-4010-ab9a-25832163ab83/neutron-api/0.log" Oct 13 19:51:24 crc kubenswrapper[4974]: I1013 19:51:24.949916 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_be594e70-2775-4c06-a266-b2fcaf428134/nova-cell0-conductor-conductor/0.log" Oct 13 19:51:25 crc kubenswrapper[4974]: I1013 19:51:25.069385 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_004c8db1-b15c-43d1-b988-92d779aaebb2/nova-cell1-conductor-conductor/0.log" Oct 13 19:51:25 crc kubenswrapper[4974]: I1013 19:51:25.422990 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_86eb0f8b-123d-4c2a-b7c6-d0a613625ee8/nova-cell1-novncproxy-novncproxy/0.log" Oct 13 19:51:25 crc kubenswrapper[4974]: I1013 19:51:25.704303 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_420e7e88-d552-4dd6-b5f8-b8ec9d8b9354/nova-api-log/0.log" Oct 13 19:51:25 crc kubenswrapper[4974]: I1013 19:51:25.763067 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-mj759_533bff4f-cd80-4893-95e8-404276a2e0d0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:25 crc kubenswrapper[4974]: I1013 19:51:25.948313 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_46af3042-50b4-462e-9449-4d521fd32afa/nova-metadata-log/0.log" Oct 13 19:51:26 crc kubenswrapper[4974]: I1013 19:51:26.002515 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_420e7e88-d552-4dd6-b5f8-b8ec9d8b9354/nova-api-api/0.log" Oct 13 19:51:26 crc kubenswrapper[4974]: I1013 19:51:26.040575 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_def34e48-c96a-4074-8780-44ba062e6816/memcached/0.log" Oct 13 19:51:26 crc kubenswrapper[4974]: I1013 19:51:26.907422 4974 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0c7ef8b9-b24d-4ddf-b764-41cbd10095e8/mysql-bootstrap/0.log" Oct 13 19:51:27 crc kubenswrapper[4974]: I1013 19:51:27.076484 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a5915d47-3416-4678-8589-31ea94154b54/nova-scheduler-scheduler/0.log" Oct 13 19:51:27 crc kubenswrapper[4974]: I1013 19:51:27.111489 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0c7ef8b9-b24d-4ddf-b764-41cbd10095e8/mysql-bootstrap/0.log" Oct 13 19:51:27 crc kubenswrapper[4974]: I1013 19:51:27.164730 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0c7ef8b9-b24d-4ddf-b764-41cbd10095e8/galera/0.log" Oct 13 19:51:27 crc kubenswrapper[4974]: I1013 19:51:27.279427 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_72dd5704-4623-4186-8914-512c4ea61a5b/mysql-bootstrap/0.log" Oct 13 19:51:27 crc kubenswrapper[4974]: I1013 19:51:27.472888 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_72dd5704-4623-4186-8914-512c4ea61a5b/galera/0.log" Oct 13 19:51:27 crc kubenswrapper[4974]: I1013 19:51:27.493882 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_e9e672e2-a15c-4cfa-b751-c6208182f2c7/openstackclient/0.log" Oct 13 19:51:27 crc kubenswrapper[4974]: I1013 19:51:27.500593 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_72dd5704-4623-4186-8914-512c4ea61a5b/mysql-bootstrap/0.log" Oct 13 19:51:27 crc kubenswrapper[4974]: I1013 19:51:27.717000 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-tvbtp_2d4633f8-1f68-4d44-8569-69f02e1886f3/openstack-network-exporter/0.log" Oct 13 19:51:27 crc kubenswrapper[4974]: I1013 19:51:27.811366 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-cxs58_d24d9e9c-90a1-490b-80d9-4d36d6050083/ovsdb-server-init/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.028149 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cxs58_d24d9e9c-90a1-490b-80d9-4d36d6050083/ovsdb-server/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.035038 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cxs58_d24d9e9c-90a1-490b-80d9-4d36d6050083/ovsdb-server-init/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.219854 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-r5pdv_c233290a-abb8-4429-8500-f4ec541ccc21/ovn-controller/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.370458 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cxs58_d24d9e9c-90a1-490b-80d9-4d36d6050083/ovs-vswitchd/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.383483 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-5wshh_a0a9ad72-5d41-4b79-8d65-797ed063b530/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.517620 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_97fd37f3-ee6b-4a37-a32b-057d21edf416/openstack-network-exporter/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.599508 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_97fd37f3-ee6b-4a37-a32b-057d21edf416/ovn-northd/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.602073 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_46af3042-50b4-462e-9449-4d521fd32afa/nova-metadata-metadata/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.657603 4974 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eede053f-a3cf-4af6-9f1a-7458bec6f5a3/openstack-network-exporter/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.773102 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eede053f-a3cf-4af6-9f1a-7458bec6f5a3/ovsdbserver-nb/0.log" Oct 13 19:51:28 crc kubenswrapper[4974]: I1013 19:51:28.798428 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_63c7b046-cd5e-42e0-b295-ca90bb6a53c9/openstack-network-exporter/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.009832 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_63c7b046-cd5e-42e0-b295-ca90bb6a53c9/ovsdbserver-sb/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.273113 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6b66599996-gvfwf_c6c335f4-044a-4970-8a80-05755d65b00a/placement-api/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.291822 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_24cb3219-26af-43ab-95da-2320e69129db/init-config-reloader/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.315765 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6b66599996-gvfwf_c6c335f4-044a-4970-8a80-05755d65b00a/placement-log/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.416014 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_24cb3219-26af-43ab-95da-2320e69129db/config-reloader/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.416608 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_24cb3219-26af-43ab-95da-2320e69129db/init-config-reloader/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.458788 4974 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_24cb3219-26af-43ab-95da-2320e69129db/prometheus/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.493533 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_24cb3219-26af-43ab-95da-2320e69129db/thanos-sidecar/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.646991 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4bf0b2fe-061e-486f-9e0f-96bd13bc7eae/setup-container/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.754090 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4bf0b2fe-061e-486f-9e0f-96bd13bc7eae/setup-container/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.780026 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4bf0b2fe-061e-486f-9e0f-96bd13bc7eae/rabbitmq/0.log" Oct 13 19:51:29 crc kubenswrapper[4974]: I1013 19:51:29.811933 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_a3edaa1a-d213-473f-963a-3bfea41226ec/setup-container/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.040424 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_a3edaa1a-d213-473f-963a-3bfea41226ec/rabbitmq/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.052750 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4b2e987b-fd90-420d-86f1-b9757dd40b03/setup-container/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.052826 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_a3edaa1a-d213-473f-963a-3bfea41226ec/setup-container/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.221463 4974 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4b2e987b-fd90-420d-86f1-b9757dd40b03/rabbitmq/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.224937 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r_fc6631d2-7807-4296-8806-da8155c0992e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.265810 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4b2e987b-fd90-420d-86f1-b9757dd40b03/setup-container/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.432378 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fg6xw_4f6b26b6-df93-48fd-bbec-18aa5a371db8/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.451726 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx_3d236165-9044-430d-92cf-33e4eadd281f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.534020 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-cmccw_127007fe-96b5-4741-b207-af9ec05b68da/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.631728 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-k7bxc_37c5496c-447a-4806-81b1-15f11c4d057e/ssh-known-hosts-edpm-deployment/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.757537 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b5b857db9-xmtpd_835d1e2d-e4b2-47d5-89a2-ef955e650cc1/proxy-server/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.849706 4974 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b5b857db9-xmtpd_835d1e2d-e4b2-47d5-89a2-ef955e650cc1/proxy-httpd/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.880462 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-7phmq_6aff3b1c-df57-4faf-9c6b-1009d5090a13/swift-ring-rebalance/0.log" Oct 13 19:51:30 crc kubenswrapper[4974]: I1013 19:51:30.960459 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/account-auditor/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.026398 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/account-reaper/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.061284 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/account-server/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.078754 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/account-replicator/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.119534 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/container-auditor/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.163922 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/container-replicator/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.248991 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/container-server/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.254414 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/container-updater/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.292811 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/object-auditor/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.306151 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/object-expirer/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.392511 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/object-replicator/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.427328 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/object-server/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.456440 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/rsync/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.480904 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/object-updater/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.525676 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/swift-recon-cron/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.655935 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr_a947ab95-2720-4cda-a618-470943b7443c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.681231 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_tempest-tests-tempest_98e331dd-24d4-4707-b432-557ea90e6048/tempest-tests-tempest-tests-runner/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.877815 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t_f2eee5ad-fe26-46b2-af3c-1477c1513609/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:51:31 crc kubenswrapper[4974]: I1013 19:51:31.905474 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e6f8e8b4-a194-4de6-b1c0-9b5b183136c5/test-operator-logs-container/0.log" Oct 13 19:51:32 crc kubenswrapper[4974]: I1013 19:51:32.616913 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_dc2497ce-b7ed-481e-88c0-eb2e7aef34f9/watcher-applier/0.log" Oct 13 19:51:33 crc kubenswrapper[4974]: I1013 19:51:33.204417 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5c387632-008a-4609-8b64-ff84c35596c7/watcher-api-log/0.log" Oct 13 19:51:35 crc kubenswrapper[4974]: I1013 19:51:35.432947 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_b6ea71d0-a795-4a73-9108-dc8e4a3e4187/watcher-decision-engine/0.log" Oct 13 19:51:36 crc kubenswrapper[4974]: I1013 19:51:36.303169 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5c387632-008a-4609-8b64-ff84c35596c7/watcher-api/0.log" Oct 13 19:51:57 crc kubenswrapper[4974]: I1013 19:51:57.739728 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/util/0.log" Oct 13 19:51:57 crc kubenswrapper[4974]: I1013 19:51:57.911055 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/util/0.log" Oct 13 19:51:57 crc kubenswrapper[4974]: I1013 19:51:57.920328 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/pull/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.031484 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/pull/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.135199 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/extract/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.148532 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/util/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.184176 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/pull/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.317782 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-hvqbv_ef8af802-f6f6-4018-9bfd-f8aee92ff838/kube-rbac-proxy/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.421324 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-hvqbv_ef8af802-f6f6-4018-9bfd-f8aee92ff838/manager/0.log" Oct 13 19:51:58 crc 
kubenswrapper[4974]: I1013 19:51:58.484828 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-t2hfb_b44da60c-a4d1-406d-abb8-db29314b9e50/kube-rbac-proxy/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.570737 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-t2hfb_b44da60c-a4d1-406d-abb8-db29314b9e50/manager/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.620757 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-h2cmd_758864e5-2a90-496e-b006-dcfaf42c20bb/kube-rbac-proxy/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.757213 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-h2cmd_758864e5-2a90-496e-b006-dcfaf42c20bb/manager/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.835250 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-wzvwc_78805c21-d9b5-4f77-a318-fa1dfa26ebc3/kube-rbac-proxy/0.log" Oct 13 19:51:58 crc kubenswrapper[4974]: I1013 19:51:58.933922 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-wzvwc_78805c21-d9b5-4f77-a318-fa1dfa26ebc3/manager/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.008857 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-n4n2k_50ce5538-ff95-4983-8ff7-3a406b974617/kube-rbac-proxy/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.058077 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-n4n2k_50ce5538-ff95-4983-8ff7-3a406b974617/manager/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.136461 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-cp96l_f332d432-86f0-4c0b-80d6-dba6e2920a81/kube-rbac-proxy/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.229153 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-cp96l_f332d432-86f0-4c0b-80d6-dba6e2920a81/manager/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.316792 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-rzm52_e152664c-85e7-4854-8960-ee413a7eb3a3/kube-rbac-proxy/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.529623 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-w4lrj_e5d3e6f8-15bf-4544-b701-da591158af75/kube-rbac-proxy/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.531311 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-w4lrj_e5d3e6f8-15bf-4544-b701-da591158af75/manager/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.623098 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-rzm52_e152664c-85e7-4854-8960-ee413a7eb3a3/manager/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.794229 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hvwzb_2b43f3c2-b280-40e9-9467-181a372011e1/kube-rbac-proxy/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.836101 4974 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hvwzb_2b43f3c2-b280-40e9-9467-181a372011e1/manager/0.log" Oct 13 19:51:59 crc kubenswrapper[4974]: I1013 19:51:59.943136 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-zkfrg_bcad591b-b126-4da8-a21c-636d710329b8/kube-rbac-proxy/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.007428 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-zkfrg_bcad591b-b126-4da8-a21c-636d710329b8/manager/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.119751 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-gq4sm_c10ae245-c899-4ea9-9edb-d62b176d19cc/kube-rbac-proxy/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.165055 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-gq4sm_c10ae245-c899-4ea9-9edb-d62b176d19cc/manager/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.258929 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-7p29r_86f89f48-3e17-4ed9-9cbb-6458223a1864/kube-rbac-proxy/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.350311 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-7p29r_86f89f48-3e17-4ed9-9cbb-6458223a1864/manager/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.440049 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-2xgsp_197d51a8-e30e-485c-8e76-bd4ee120da7b/kube-rbac-proxy/0.log" Oct 13 19:52:00 crc 
kubenswrapper[4974]: I1013 19:52:00.549262 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-2xgsp_197d51a8-e30e-485c-8e76-bd4ee120da7b/manager/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.626116 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-zx7qh_923ead90-d60a-431b-9630-693bdc007237/kube-rbac-proxy/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.683733 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-zx7qh_923ead90-d60a-431b-9630-693bdc007237/manager/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.805922 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w_d95330d6-c9ed-4fe6-8daa-6ef9495e72ae/manager/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.819878 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w_d95330d6-c9ed-4fe6-8daa-6ef9495e72ae/kube-rbac-proxy/0.log" Oct 13 19:52:00 crc kubenswrapper[4974]: I1013 19:52:00.964922 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7995b9c57f-x4jst_92f85149-41d6-471d-8d77-25fdafb20ca2/kube-rbac-proxy/0.log" Oct 13 19:52:01 crc kubenswrapper[4974]: I1013 19:52:01.108921 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8d8df4487-k7bh6_72ade784-9a52-4442-b3e6-044297f70cb7/kube-rbac-proxy/0.log" Oct 13 19:52:01 crc kubenswrapper[4974]: I1013 19:52:01.292102 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8d8df4487-k7bh6_72ade784-9a52-4442-b3e6-044297f70cb7/operator/0.log" Oct 13 19:52:01 crc kubenswrapper[4974]: I1013 19:52:01.370708 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zdv77_208151b9-4d45-4a71-9417-5082f935fd8b/registry-server/0.log" Oct 13 19:52:01 crc kubenswrapper[4974]: I1013 19:52:01.473517 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-8ftd2_f9ed9202-2a09-42d2-b140-8300e108e36a/kube-rbac-proxy/0.log" Oct 13 19:52:01 crc kubenswrapper[4974]: I1013 19:52:01.525407 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-8ftd2_f9ed9202-2a09-42d2-b140-8300e108e36a/manager/0.log" Oct 13 19:52:01 crc kubenswrapper[4974]: I1013 19:52:01.609124 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-rs4rf_4cbd873e-490d-4f1c-91cc-4ca45f109d7f/kube-rbac-proxy/0.log" Oct 13 19:52:01 crc kubenswrapper[4974]: I1013 19:52:01.680195 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-rs4rf_4cbd873e-490d-4f1c-91cc-4ca45f109d7f/manager/0.log" Oct 13 19:52:01 crc kubenswrapper[4974]: I1013 19:52:01.874645 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-scqj9_9a259044-9901-4a97-89f7-965118976af7/operator/0.log" Oct 13 19:52:01 crc kubenswrapper[4974]: I1013 19:52:01.978651 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7995b9c57f-x4jst_92f85149-41d6-471d-8d77-25fdafb20ca2/manager/0.log" Oct 13 19:52:02 crc kubenswrapper[4974]: I1013 19:52:02.134211 4974 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-mj7kp_45bbc336-9feb-40e0-b7a9-92fad85e7396/kube-rbac-proxy/0.log" Oct 13 19:52:02 crc kubenswrapper[4974]: I1013 19:52:02.221033 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-mj7kp_45bbc336-9feb-40e0-b7a9-92fad85e7396/manager/0.log" Oct 13 19:52:02 crc kubenswrapper[4974]: I1013 19:52:02.283235 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-2q9c8_7269886e-6ad1-43fe-a8f2-c535dffe836c/kube-rbac-proxy/0.log" Oct 13 19:52:02 crc kubenswrapper[4974]: I1013 19:52:02.404063 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-2q9c8_7269886e-6ad1-43fe-a8f2-c535dffe836c/manager/0.log" Oct 13 19:52:02 crc kubenswrapper[4974]: I1013 19:52:02.432212 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-lb8ln_e6e02a94-3239-4e8b-8d87-4adb4ebcc98b/manager/0.log" Oct 13 19:52:02 crc kubenswrapper[4974]: I1013 19:52:02.464574 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-lb8ln_e6e02a94-3239-4e8b-8d87-4adb4ebcc98b/kube-rbac-proxy/0.log" Oct 13 19:52:02 crc kubenswrapper[4974]: I1013 19:52:02.651010 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6f64d8b78-d6wjd_7bb571d8-3894-46f5-a627-932b5dfdc2fd/kube-rbac-proxy/0.log" Oct 13 19:52:02 crc kubenswrapper[4974]: I1013 19:52:02.662704 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6f64d8b78-d6wjd_7bb571d8-3894-46f5-a627-932b5dfdc2fd/manager/0.log" Oct 13 19:52:19 crc kubenswrapper[4974]: I1013 
19:52:19.629713 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h6bwk_78a93fc9-5305-44ea-a573-3e54bd52f22d/control-plane-machine-set-operator/0.log" Oct 13 19:52:19 crc kubenswrapper[4974]: I1013 19:52:19.822960 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-knlnk_bc5f4140-1f56-472e-95ed-cf3d4fb85f45/kube-rbac-proxy/0.log" Oct 13 19:52:19 crc kubenswrapper[4974]: I1013 19:52:19.851974 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-knlnk_bc5f4140-1f56-472e-95ed-cf3d4fb85f45/machine-api-operator/0.log" Oct 13 19:52:32 crc kubenswrapper[4974]: I1013 19:52:32.444780 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-79kfk_37764aab-fdaf-4d54-8afc-f2788411ff07/cert-manager-controller/0.log" Oct 13 19:52:32 crc kubenswrapper[4974]: I1013 19:52:32.613950 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-pbvz7_70f7d9f3-cbb4-4009-9feb-89a4eb2bbf95/cert-manager-cainjector/0.log" Oct 13 19:52:32 crc kubenswrapper[4974]: I1013 19:52:32.657090 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-lhmgd_88a02d7c-89e6-464c-b519-aeb3fe4dfda3/cert-manager-webhook/0.log" Oct 13 19:52:45 crc kubenswrapper[4974]: I1013 19:52:45.372271 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-rx8hz_4d9c8026-13ca-4df7-8bfc-d36594573e26/nmstate-console-plugin/0.log" Oct 13 19:52:45 crc kubenswrapper[4974]: I1013 19:52:45.540785 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-skb5t_51e7a340-7c32-4fae-b22f-2dd321f0afc1/nmstate-handler/0.log" Oct 13 19:52:45 crc kubenswrapper[4974]: I1013 
19:52:45.603944 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-zg86t_16f35fcb-c349-4306-a4ee-306dfff9a8f1/kube-rbac-proxy/0.log" Oct 13 19:52:45 crc kubenswrapper[4974]: I1013 19:52:45.628949 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-zg86t_16f35fcb-c349-4306-a4ee-306dfff9a8f1/nmstate-metrics/0.log" Oct 13 19:52:45 crc kubenswrapper[4974]: I1013 19:52:45.774418 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-5cgww_b8f48940-751e-4e3c-98a3-d29c1b73e776/nmstate-operator/0.log" Oct 13 19:52:45 crc kubenswrapper[4974]: I1013 19:52:45.810212 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-rkbf2_ded07897-9b9c-4548-a909-02c623167912/nmstate-webhook/0.log" Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.215358 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-2t9km_77a8d6d5-aa09-4168-8c4f-228849d999e2/kube-rbac-proxy/0.log" Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.394257 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-frr-files/0.log" Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.430189 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-2t9km_77a8d6d5-aa09-4168-8c4f-228849d999e2/controller/0.log" Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.642180 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-frr-files/0.log" Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.642235 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-metrics/0.log" 
Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.643156 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-reloader/0.log" Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.701137 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-reloader/0.log" Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.820565 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-metrics/0.log" Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.844084 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-frr-files/0.log" Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.852779 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-reloader/0.log" Oct 13 19:53:00 crc kubenswrapper[4974]: I1013 19:53:00.875483 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-metrics/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.038008 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-reloader/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.043765 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-frr-files/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.045136 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-metrics/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.082624 4974 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/controller/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.213394 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/kube-rbac-proxy/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.221756 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/frr-metrics/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.293147 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/kube-rbac-proxy-frr/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.417420 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/reloader/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.530540 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-t5brv_b014985a-51e5-494a-a16b-c126e6fce6b3/frr-k8s-webhook-server/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.651175 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-fd8c579fc-kgnkv_b51360f9-c2df-4940-8a9f-91bd9287605c/manager/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.879950 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-9c5d5965c-l9f5q_ca62cae8-3dc3-492d-aa06-59d085da2253/webhook-server/0.log" Oct 13 19:53:01 crc kubenswrapper[4974]: I1013 19:53:01.941999 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lc487_6d0e7abb-aa57-48af-9a9a-d3c626b9131a/kube-rbac-proxy/0.log" Oct 13 19:53:02 crc kubenswrapper[4974]: I1013 
19:53:02.624151 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lc487_6d0e7abb-aa57-48af-9a9a-d3c626b9131a/speaker/0.log" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.012968 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/frr/0.log" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.124020 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9lndx"] Oct 13 19:53:03 crc kubenswrapper[4974]: E1013 19:53:03.124421 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f9a96f-ee13-4237-9e37-e65eb8320ec4" containerName="container-00" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.124437 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f9a96f-ee13-4237-9e37-e65eb8320ec4" containerName="container-00" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.124671 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f9a96f-ee13-4237-9e37-e65eb8320ec4" containerName="container-00" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.126035 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.142520 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9lndx"] Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.219412 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-catalog-content\") pod \"redhat-operators-9lndx\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.219531 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-utilities\") pod \"redhat-operators-9lndx\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.219604 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48zw5\" (UniqueName: \"kubernetes.io/projected/c211075a-896a-4f78-a2e9-7e4431546b9e-kube-api-access-48zw5\") pod \"redhat-operators-9lndx\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.320938 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-catalog-content\") pod \"redhat-operators-9lndx\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.321070 4974 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-utilities\") pod \"redhat-operators-9lndx\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.321150 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48zw5\" (UniqueName: \"kubernetes.io/projected/c211075a-896a-4f78-a2e9-7e4431546b9e-kube-api-access-48zw5\") pod \"redhat-operators-9lndx\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.321396 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-catalog-content\") pod \"redhat-operators-9lndx\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.321621 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-utilities\") pod \"redhat-operators-9lndx\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.339224 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48zw5\" (UniqueName: \"kubernetes.io/projected/c211075a-896a-4f78-a2e9-7e4431546b9e-kube-api-access-48zw5\") pod \"redhat-operators-9lndx\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:03 crc kubenswrapper[4974]: I1013 19:53:03.462336 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:04 crc kubenswrapper[4974]: I1013 19:53:04.037380 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9lndx"] Oct 13 19:53:04 crc kubenswrapper[4974]: I1013 19:53:04.762218 4974 generic.go:334] "Generic (PLEG): container finished" podID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerID="c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa" exitCode=0 Oct 13 19:53:04 crc kubenswrapper[4974]: I1013 19:53:04.762296 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lndx" event={"ID":"c211075a-896a-4f78-a2e9-7e4431546b9e","Type":"ContainerDied","Data":"c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa"} Oct 13 19:53:04 crc kubenswrapper[4974]: I1013 19:53:04.762904 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lndx" event={"ID":"c211075a-896a-4f78-a2e9-7e4431546b9e","Type":"ContainerStarted","Data":"390524db2db720f7c773ea4b515daf4619bcec5c4755628b2f563c4e6518ea43"} Oct 13 19:53:04 crc kubenswrapper[4974]: I1013 19:53:04.764718 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Oct 13 19:53:06 crc kubenswrapper[4974]: I1013 19:53:06.781917 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lndx" event={"ID":"c211075a-896a-4f78-a2e9-7e4431546b9e","Type":"ContainerStarted","Data":"210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47"} Oct 13 19:53:08 crc kubenswrapper[4974]: I1013 19:53:08.804163 4974 generic.go:334] "Generic (PLEG): container finished" podID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerID="210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47" exitCode=0 Oct 13 19:53:08 crc kubenswrapper[4974]: I1013 19:53:08.804499 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-9lndx" event={"ID":"c211075a-896a-4f78-a2e9-7e4431546b9e","Type":"ContainerDied","Data":"210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47"} Oct 13 19:53:09 crc kubenswrapper[4974]: I1013 19:53:09.823555 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lndx" event={"ID":"c211075a-896a-4f78-a2e9-7e4431546b9e","Type":"ContainerStarted","Data":"5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02"} Oct 13 19:53:09 crc kubenswrapper[4974]: I1013 19:53:09.853420 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9lndx" podStartSLOduration=2.30328759 podStartE2EDuration="6.853400883s" podCreationTimestamp="2025-10-13 19:53:03 +0000 UTC" firstStartedPulling="2025-10-13 19:53:04.764314816 +0000 UTC m=+5919.668680906" lastFinishedPulling="2025-10-13 19:53:09.314428079 +0000 UTC m=+5924.218794199" observedRunningTime="2025-10-13 19:53:09.85116135 +0000 UTC m=+5924.755527450" watchObservedRunningTime="2025-10-13 19:53:09.853400883 +0000 UTC m=+5924.757766963" Oct 13 19:53:13 crc kubenswrapper[4974]: I1013 19:53:13.464050 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:13 crc kubenswrapper[4974]: I1013 19:53:13.464346 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:14 crc kubenswrapper[4974]: I1013 19:53:14.511225 4974 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9lndx" podUID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerName="registry-server" probeResult="failure" output=< Oct 13 19:53:14 crc kubenswrapper[4974]: timeout: failed to connect service ":50051" within 1s Oct 13 19:53:14 crc kubenswrapper[4974]: > Oct 13 19:53:16 crc kubenswrapper[4974]: I1013 
19:53:16.687104 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/util/0.log" Oct 13 19:53:16 crc kubenswrapper[4974]: I1013 19:53:16.993896 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/util/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.042840 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/pull/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.043182 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/pull/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.191829 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/util/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.224580 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/extract/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.250169 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/pull/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.401134 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/util/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.569951 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/util/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.586396 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/pull/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.588926 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/pull/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.773669 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/util/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.807463 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/extract/0.log" Oct 13 19:53:17 crc kubenswrapper[4974]: I1013 19:53:17.826247 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/pull/0.log" Oct 13 19:53:18 crc kubenswrapper[4974]: I1013 19:53:18.019616 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-utilities/0.log" Oct 13 19:53:18 crc 
kubenswrapper[4974]: I1013 19:53:18.178066 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-utilities/0.log" Oct 13 19:53:18 crc kubenswrapper[4974]: I1013 19:53:18.234390 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-content/0.log" Oct 13 19:53:18 crc kubenswrapper[4974]: I1013 19:53:18.238719 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-content/0.log" Oct 13 19:53:18 crc kubenswrapper[4974]: I1013 19:53:18.393321 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-utilities/0.log" Oct 13 19:53:18 crc kubenswrapper[4974]: I1013 19:53:18.396150 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-content/0.log" Oct 13 19:53:18 crc kubenswrapper[4974]: I1013 19:53:18.651679 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-utilities/0.log" Oct 13 19:53:18 crc kubenswrapper[4974]: I1013 19:53:18.733259 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/registry-server/0.log" Oct 13 19:53:18 crc kubenswrapper[4974]: I1013 19:53:18.770713 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-utilities/0.log" Oct 13 19:53:18 crc kubenswrapper[4974]: I1013 19:53:18.797443 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-content/0.log" Oct 13 19:53:18 crc kubenswrapper[4974]: I1013 19:53:18.868467 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-content/0.log" Oct 13 19:53:19 crc kubenswrapper[4974]: I1013 19:53:19.037641 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-content/0.log" Oct 13 19:53:19 crc kubenswrapper[4974]: I1013 19:53:19.075514 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-utilities/0.log" Oct 13 19:53:19 crc kubenswrapper[4974]: I1013 19:53:19.385400 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/util/0.log" Oct 13 19:53:19 crc kubenswrapper[4974]: I1013 19:53:19.560063 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/util/0.log" Oct 13 19:53:19 crc kubenswrapper[4974]: I1013 19:53:19.599630 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/pull/0.log" Oct 13 19:53:19 crc kubenswrapper[4974]: I1013 19:53:19.743939 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/pull/0.log" Oct 13 19:53:19 crc kubenswrapper[4974]: I1013 19:53:19.927715 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/extract/0.log" Oct 13 19:53:19 crc kubenswrapper[4974]: I1013 19:53:19.945469 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/pull/0.log" Oct 13 19:53:19 crc kubenswrapper[4974]: I1013 19:53:19.975578 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/util/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.081995 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/registry-server/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.141460 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jvrtp_09e0a416-6821-4853-8c22-d5e55e540657/marketplace-operator/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.320885 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-utilities/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.448153 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-utilities/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.500421 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-content/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.507841 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-content/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.640092 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-utilities/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.684397 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-content/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.741645 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-utilities/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.864724 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/registry-server/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.916884 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-utilities/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.928540 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-content/0.log" Oct 13 19:53:20 crc kubenswrapper[4974]: I1013 19:53:20.964038 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-content/0.log" Oct 13 19:53:21 crc kubenswrapper[4974]: I1013 19:53:21.084296 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-content/0.log" Oct 
13 19:53:21 crc kubenswrapper[4974]: I1013 19:53:21.105874 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-utilities/0.log" Oct 13 19:53:21 crc kubenswrapper[4974]: I1013 19:53:21.176332 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9lndx_c211075a-896a-4f78-a2e9-7e4431546b9e/extract-utilities/0.log" Oct 13 19:53:21 crc kubenswrapper[4974]: I1013 19:53:21.373353 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9lndx_c211075a-896a-4f78-a2e9-7e4431546b9e/extract-content/0.log" Oct 13 19:53:21 crc kubenswrapper[4974]: I1013 19:53:21.419913 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9lndx_c211075a-896a-4f78-a2e9-7e4431546b9e/extract-utilities/0.log" Oct 13 19:53:21 crc kubenswrapper[4974]: I1013 19:53:21.446460 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9lndx_c211075a-896a-4f78-a2e9-7e4431546b9e/extract-content/0.log" Oct 13 19:53:21 crc kubenswrapper[4974]: I1013 19:53:21.612815 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9lndx_c211075a-896a-4f78-a2e9-7e4431546b9e/extract-content/0.log" Oct 13 19:53:21 crc kubenswrapper[4974]: I1013 19:53:21.646208 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9lndx_c211075a-896a-4f78-a2e9-7e4431546b9e/extract-utilities/0.log" Oct 13 19:53:21 crc kubenswrapper[4974]: I1013 19:53:21.679574 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9lndx_c211075a-896a-4f78-a2e9-7e4431546b9e/registry-server/0.log" Oct 13 19:53:21 crc kubenswrapper[4974]: I1013 19:53:21.729308 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/registry-server/0.log" Oct 13 19:53:23 crc kubenswrapper[4974]: I1013 19:53:23.537370 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:23 crc kubenswrapper[4974]: I1013 19:53:23.615540 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:23 crc kubenswrapper[4974]: I1013 19:53:23.779864 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9lndx"] Oct 13 19:53:24 crc kubenswrapper[4974]: I1013 19:53:24.956419 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9lndx" podUID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerName="registry-server" containerID="cri-o://5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02" gracePeriod=2 Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.476299 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.590102 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48zw5\" (UniqueName: \"kubernetes.io/projected/c211075a-896a-4f78-a2e9-7e4431546b9e-kube-api-access-48zw5\") pod \"c211075a-896a-4f78-a2e9-7e4431546b9e\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.590191 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-utilities\") pod \"c211075a-896a-4f78-a2e9-7e4431546b9e\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.590350 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-catalog-content\") pod \"c211075a-896a-4f78-a2e9-7e4431546b9e\" (UID: \"c211075a-896a-4f78-a2e9-7e4431546b9e\") " Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.591869 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-utilities" (OuterVolumeSpecName: "utilities") pod "c211075a-896a-4f78-a2e9-7e4431546b9e" (UID: "c211075a-896a-4f78-a2e9-7e4431546b9e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.601971 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c211075a-896a-4f78-a2e9-7e4431546b9e-kube-api-access-48zw5" (OuterVolumeSpecName: "kube-api-access-48zw5") pod "c211075a-896a-4f78-a2e9-7e4431546b9e" (UID: "c211075a-896a-4f78-a2e9-7e4431546b9e"). InnerVolumeSpecName "kube-api-access-48zw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.672489 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c211075a-896a-4f78-a2e9-7e4431546b9e" (UID: "c211075a-896a-4f78-a2e9-7e4431546b9e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.692990 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48zw5\" (UniqueName: \"kubernetes.io/projected/c211075a-896a-4f78-a2e9-7e4431546b9e-kube-api-access-48zw5\") on node \"crc\" DevicePath \"\"" Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.693032 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.693041 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c211075a-896a-4f78-a2e9-7e4431546b9e-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.970052 4974 generic.go:334] "Generic (PLEG): container finished" podID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerID="5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02" exitCode=0 Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.970116 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lndx" event={"ID":"c211075a-896a-4f78-a2e9-7e4431546b9e","Type":"ContainerDied","Data":"5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02"} Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.970128 4974 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9lndx" Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.970156 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9lndx" event={"ID":"c211075a-896a-4f78-a2e9-7e4431546b9e","Type":"ContainerDied","Data":"390524db2db720f7c773ea4b515daf4619bcec5c4755628b2f563c4e6518ea43"} Oct 13 19:53:25 crc kubenswrapper[4974]: I1013 19:53:25.970179 4974 scope.go:117] "RemoveContainer" containerID="5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02" Oct 13 19:53:26 crc kubenswrapper[4974]: I1013 19:53:26.004790 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9lndx"] Oct 13 19:53:26 crc kubenswrapper[4974]: I1013 19:53:26.006481 4974 scope.go:117] "RemoveContainer" containerID="210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47" Oct 13 19:53:26 crc kubenswrapper[4974]: I1013 19:53:26.015317 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9lndx"] Oct 13 19:53:26 crc kubenswrapper[4974]: I1013 19:53:26.042063 4974 scope.go:117] "RemoveContainer" containerID="c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa" Oct 13 19:53:26 crc kubenswrapper[4974]: I1013 19:53:26.114175 4974 scope.go:117] "RemoveContainer" containerID="5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02" Oct 13 19:53:26 crc kubenswrapper[4974]: E1013 19:53:26.115611 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02\": container with ID starting with 5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02 not found: ID does not exist" containerID="5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02" Oct 13 19:53:26 crc kubenswrapper[4974]: I1013 19:53:26.115742 4974 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02"} err="failed to get container status \"5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02\": rpc error: code = NotFound desc = could not find container \"5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02\": container with ID starting with 5447562007c541757dcc54b81308b3283974b7c28b98203e0df799986d83fb02 not found: ID does not exist" Oct 13 19:53:26 crc kubenswrapper[4974]: I1013 19:53:26.115791 4974 scope.go:117] "RemoveContainer" containerID="210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47" Oct 13 19:53:26 crc kubenswrapper[4974]: E1013 19:53:26.116513 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47\": container with ID starting with 210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47 not found: ID does not exist" containerID="210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47" Oct 13 19:53:26 crc kubenswrapper[4974]: I1013 19:53:26.116581 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47"} err="failed to get container status \"210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47\": rpc error: code = NotFound desc = could not find container \"210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47\": container with ID starting with 210f14f1f644d123ba51c403c22c2aa0eaa816788a7298793a8a959772092a47 not found: ID does not exist" Oct 13 19:53:26 crc kubenswrapper[4974]: I1013 19:53:26.116767 4974 scope.go:117] "RemoveContainer" containerID="c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa" Oct 13 19:53:26 crc kubenswrapper[4974]: E1013 
19:53:26.117402 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa\": container with ID starting with c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa not found: ID does not exist" containerID="c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa" Oct 13 19:53:26 crc kubenswrapper[4974]: I1013 19:53:26.117452 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa"} err="failed to get container status \"c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa\": rpc error: code = NotFound desc = could not find container \"c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa\": container with ID starting with c0d9b5b03d769bcf3bc629bcf93629975a393ca7211e3e384c0c7f498ada00fa not found: ID does not exist" Oct 13 19:53:27 crc kubenswrapper[4974]: I1013 19:53:27.828885 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c211075a-896a-4f78-a2e9-7e4431546b9e" path="/var/lib/kubelet/pods/c211075a-896a-4f78-a2e9-7e4431546b9e/volumes" Oct 13 19:53:34 crc kubenswrapper[4974]: I1013 19:53:34.624686 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-qc6nj_bf1f0f19-a1c6-4d16-8876-a70c018e0452/prometheus-operator/0.log" Oct 13 19:53:34 crc kubenswrapper[4974]: I1013 19:53:34.807960 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67d85487b6-5hftg_33d95548-42f2-4bde-88eb-23cfd6a5c5c0/prometheus-operator-admission-webhook/0.log" Oct 13 19:53:34 crc kubenswrapper[4974]: I1013 19:53:34.849900 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt_cffd65cb-eb29-48d8-b634-4e535b39ce51/prometheus-operator-admission-webhook/0.log" Oct 13 19:53:34 crc kubenswrapper[4974]: I1013 19:53:34.975167 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-zqdvp_adeddb78-abbb-494b-b723-d3ed7a66503f/operator/0.log" Oct 13 19:53:35 crc kubenswrapper[4974]: I1013 19:53:35.047514 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-8mc8w_bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471/perses-operator/0.log" Oct 13 19:53:37 crc kubenswrapper[4974]: I1013 19:53:37.743479 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:53:37 crc kubenswrapper[4974]: I1013 19:53:37.744165 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:54:07 crc kubenswrapper[4974]: I1013 19:54:07.743377 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:54:07 crc kubenswrapper[4974]: I1013 19:54:07.744010 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:54:17 crc kubenswrapper[4974]: E1013 19:54:17.966549 4974 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/system.slice/rpm-ostreed.service\": RecentStats: unable to find data in memory cache]" Oct 13 19:54:37 crc kubenswrapper[4974]: I1013 19:54:37.743139 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 19:54:37 crc kubenswrapper[4974]: I1013 19:54:37.743881 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 19:54:37 crc kubenswrapper[4974]: I1013 19:54:37.744098 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 19:54:37 crc kubenswrapper[4974]: I1013 19:54:37.745216 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 19:54:37 crc kubenswrapper[4974]: I1013 19:54:37.745315 4974 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" gracePeriod=600 Oct 13 19:54:37 crc kubenswrapper[4974]: E1013 19:54:37.876971 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:54:38 crc kubenswrapper[4974]: I1013 19:54:38.826875 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" exitCode=0 Oct 13 19:54:38 crc kubenswrapper[4974]: I1013 19:54:38.826941 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40"} Oct 13 19:54:38 crc kubenswrapper[4974]: I1013 19:54:38.827288 4974 scope.go:117] "RemoveContainer" containerID="aacae48f28a8d16fcda75b3e9bb8a19abe359616918070df5b23b5f7d085a4ab" Oct 13 19:54:38 crc kubenswrapper[4974]: I1013 19:54:38.828479 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:54:38 crc kubenswrapper[4974]: E1013 19:54:38.828972 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:54:49 crc kubenswrapper[4974]: I1013 19:54:49.812306 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:54:49 crc kubenswrapper[4974]: E1013 19:54:49.813302 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:55:00 crc kubenswrapper[4974]: I1013 19:55:00.811749 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:55:00 crc kubenswrapper[4974]: E1013 19:55:00.813999 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:55:11 crc kubenswrapper[4974]: I1013 19:55:11.812453 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:55:11 crc kubenswrapper[4974]: E1013 19:55:11.814334 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:55:23 crc kubenswrapper[4974]: I1013 19:55:23.816545 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:55:23 crc kubenswrapper[4974]: E1013 19:55:23.817850 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:55:26 crc kubenswrapper[4974]: I1013 19:55:26.408708 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6cfrc/must-gather-hn667" event={"ID":"253acb3b-63bf-4e3b-841a-44a99311d3b3","Type":"ContainerDied","Data":"45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055"} Oct 13 19:55:26 crc kubenswrapper[4974]: I1013 19:55:26.408756 4974 generic.go:334] "Generic (PLEG): container finished" podID="253acb3b-63bf-4e3b-841a-44a99311d3b3" containerID="45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055" exitCode=0 Oct 13 19:55:26 crc kubenswrapper[4974]: I1013 19:55:26.410009 4974 scope.go:117] "RemoveContainer" containerID="45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055" Oct 13 19:55:26 crc kubenswrapper[4974]: I1013 19:55:26.592612 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6cfrc_must-gather-hn667_253acb3b-63bf-4e3b-841a-44a99311d3b3/gather/0.log" Oct 13 19:55:34 crc kubenswrapper[4974]: I1013 19:55:34.748091 4974 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-must-gather-6cfrc/must-gather-hn667"] Oct 13 19:55:34 crc kubenswrapper[4974]: I1013 19:55:34.749072 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6cfrc/must-gather-hn667" podUID="253acb3b-63bf-4e3b-841a-44a99311d3b3" containerName="copy" containerID="cri-o://a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667" gracePeriod=2 Oct 13 19:55:34 crc kubenswrapper[4974]: I1013 19:55:34.755812 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6cfrc/must-gather-hn667"] Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.174505 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6cfrc_must-gather-hn667_253acb3b-63bf-4e3b-841a-44a99311d3b3/copy/0.log" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.175403 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6cfrc/must-gather-hn667" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.271554 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wf2p4\" (UniqueName: \"kubernetes.io/projected/253acb3b-63bf-4e3b-841a-44a99311d3b3-kube-api-access-wf2p4\") pod \"253acb3b-63bf-4e3b-841a-44a99311d3b3\" (UID: \"253acb3b-63bf-4e3b-841a-44a99311d3b3\") " Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.271708 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/253acb3b-63bf-4e3b-841a-44a99311d3b3-must-gather-output\") pod \"253acb3b-63bf-4e3b-841a-44a99311d3b3\" (UID: \"253acb3b-63bf-4e3b-841a-44a99311d3b3\") " Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.277573 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/253acb3b-63bf-4e3b-841a-44a99311d3b3-kube-api-access-wf2p4" 
(OuterVolumeSpecName: "kube-api-access-wf2p4") pod "253acb3b-63bf-4e3b-841a-44a99311d3b3" (UID: "253acb3b-63bf-4e3b-841a-44a99311d3b3"). InnerVolumeSpecName "kube-api-access-wf2p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.373833 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wf2p4\" (UniqueName: \"kubernetes.io/projected/253acb3b-63bf-4e3b-841a-44a99311d3b3-kube-api-access-wf2p4\") on node \"crc\" DevicePath \"\"" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.489808 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253acb3b-63bf-4e3b-841a-44a99311d3b3-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "253acb3b-63bf-4e3b-841a-44a99311d3b3" (UID: "253acb3b-63bf-4e3b-841a-44a99311d3b3"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.540256 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6cfrc_must-gather-hn667_253acb3b-63bf-4e3b-841a-44a99311d3b3/copy/0.log" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.540621 4974 generic.go:334] "Generic (PLEG): container finished" podID="253acb3b-63bf-4e3b-841a-44a99311d3b3" containerID="a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667" exitCode=143 Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.540698 4974 scope.go:117] "RemoveContainer" containerID="a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.540869 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6cfrc/must-gather-hn667" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.570388 4974 scope.go:117] "RemoveContainer" containerID="45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.576493 4974 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/253acb3b-63bf-4e3b-841a-44a99311d3b3-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.659804 4974 scope.go:117] "RemoveContainer" containerID="a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667" Oct 13 19:55:35 crc kubenswrapper[4974]: E1013 19:55:35.660233 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667\": container with ID starting with a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667 not found: ID does not exist" containerID="a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.660271 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667"} err="failed to get container status \"a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667\": rpc error: code = NotFound desc = could not find container \"a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667\": container with ID starting with a23c8da3480cd24d0c1d95322e914e2f15b896b65e420cb6ac53ff6076fd2667 not found: ID does not exist" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.660296 4974 scope.go:117] "RemoveContainer" containerID="45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055" Oct 13 19:55:35 crc kubenswrapper[4974]: E1013 
19:55:35.661001 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055\": container with ID starting with 45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055 not found: ID does not exist" containerID="45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.661031 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055"} err="failed to get container status \"45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055\": rpc error: code = NotFound desc = could not find container \"45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055\": container with ID starting with 45bdac72edfc62e99ddf74294fa25d58253f2661b2f0e2d731491aeae61e0055 not found: ID does not exist" Oct 13 19:55:35 crc kubenswrapper[4974]: I1013 19:55:35.829484 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253acb3b-63bf-4e3b-841a-44a99311d3b3" path="/var/lib/kubelet/pods/253acb3b-63bf-4e3b-841a-44a99311d3b3/volumes" Oct 13 19:55:38 crc kubenswrapper[4974]: I1013 19:55:38.812457 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:55:38 crc kubenswrapper[4974]: E1013 19:55:38.813148 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:55:51 crc kubenswrapper[4974]: I1013 19:55:51.812007 
4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:55:51 crc kubenswrapper[4974]: E1013 19:55:51.813392 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:56:04 crc kubenswrapper[4974]: I1013 19:56:04.811559 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:56:04 crc kubenswrapper[4974]: E1013 19:56:04.812557 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:56:19 crc kubenswrapper[4974]: I1013 19:56:19.811557 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:56:19 crc kubenswrapper[4974]: E1013 19:56:19.812452 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 
19:56:21.219721 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9wdjx/must-gather-knsvx"] Oct 13 19:56:21 crc kubenswrapper[4974]: E1013 19:56:21.221442 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253acb3b-63bf-4e3b-841a-44a99311d3b3" containerName="gather" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.221529 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="253acb3b-63bf-4e3b-841a-44a99311d3b3" containerName="gather" Oct 13 19:56:21 crc kubenswrapper[4974]: E1013 19:56:21.221605 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerName="registry-server" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.221709 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerName="registry-server" Oct 13 19:56:21 crc kubenswrapper[4974]: E1013 19:56:21.221806 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerName="extract-content" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.221869 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerName="extract-content" Oct 13 19:56:21 crc kubenswrapper[4974]: E1013 19:56:21.221926 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253acb3b-63bf-4e3b-841a-44a99311d3b3" containerName="copy" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.221998 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="253acb3b-63bf-4e3b-841a-44a99311d3b3" containerName="copy" Oct 13 19:56:21 crc kubenswrapper[4974]: E1013 19:56:21.222064 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerName="extract-utilities" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.222124 4974 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerName="extract-utilities" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.222409 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="253acb3b-63bf-4e3b-841a-44a99311d3b3" containerName="gather" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.222477 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="253acb3b-63bf-4e3b-841a-44a99311d3b3" containerName="copy" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.222552 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="c211075a-896a-4f78-a2e9-7e4431546b9e" containerName="registry-server" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.223901 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/must-gather-knsvx" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.244203 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9wdjx"/"openshift-service-ca.crt" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.244419 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-9wdjx"/"kube-root-ca.crt" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.244762 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-9wdjx"/"default-dockercfg-gp7k6" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.258792 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9wdjx/must-gather-knsvx"] Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.299966 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-must-gather-output\") pod \"must-gather-knsvx\" (UID: \"4c30b33c-8e8d-4907-8d74-c3809c6ebeda\") " 
pod="openshift-must-gather-9wdjx/must-gather-knsvx" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.300047 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmbhh\" (UniqueName: \"kubernetes.io/projected/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-kube-api-access-jmbhh\") pod \"must-gather-knsvx\" (UID: \"4c30b33c-8e8d-4907-8d74-c3809c6ebeda\") " pod="openshift-must-gather-9wdjx/must-gather-knsvx" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.402294 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-must-gather-output\") pod \"must-gather-knsvx\" (UID: \"4c30b33c-8e8d-4907-8d74-c3809c6ebeda\") " pod="openshift-must-gather-9wdjx/must-gather-knsvx" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.402965 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-must-gather-output\") pod \"must-gather-knsvx\" (UID: \"4c30b33c-8e8d-4907-8d74-c3809c6ebeda\") " pod="openshift-must-gather-9wdjx/must-gather-knsvx" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.403203 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmbhh\" (UniqueName: \"kubernetes.io/projected/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-kube-api-access-jmbhh\") pod \"must-gather-knsvx\" (UID: \"4c30b33c-8e8d-4907-8d74-c3809c6ebeda\") " pod="openshift-must-gather-9wdjx/must-gather-knsvx" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.429286 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmbhh\" (UniqueName: \"kubernetes.io/projected/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-kube-api-access-jmbhh\") pod \"must-gather-knsvx\" (UID: \"4c30b33c-8e8d-4907-8d74-c3809c6ebeda\") " 
pod="openshift-must-gather-9wdjx/must-gather-knsvx" Oct 13 19:56:21 crc kubenswrapper[4974]: I1013 19:56:21.595095 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/must-gather-knsvx" Oct 13 19:56:22 crc kubenswrapper[4974]: I1013 19:56:22.073983 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-9wdjx/must-gather-knsvx"] Oct 13 19:56:22 crc kubenswrapper[4974]: I1013 19:56:22.164398 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/must-gather-knsvx" event={"ID":"4c30b33c-8e8d-4907-8d74-c3809c6ebeda","Type":"ContainerStarted","Data":"c2f4d8ec3aad0ee7ec041fe3b7e958e52054b4e62e0f3ff3369f904040a734fa"} Oct 13 19:56:23 crc kubenswrapper[4974]: I1013 19:56:23.177135 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/must-gather-knsvx" event={"ID":"4c30b33c-8e8d-4907-8d74-c3809c6ebeda","Type":"ContainerStarted","Data":"20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832"} Oct 13 19:56:23 crc kubenswrapper[4974]: I1013 19:56:23.177757 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/must-gather-knsvx" event={"ID":"4c30b33c-8e8d-4907-8d74-c3809c6ebeda","Type":"ContainerStarted","Data":"d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f"} Oct 13 19:56:23 crc kubenswrapper[4974]: I1013 19:56:23.210541 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9wdjx/must-gather-knsvx" podStartSLOduration=2.2105213360000002 podStartE2EDuration="2.210521336s" podCreationTimestamp="2025-10-13 19:56:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 19:56:23.203283482 +0000 UTC m=+6118.107649602" watchObservedRunningTime="2025-10-13 19:56:23.210521336 +0000 UTC m=+6118.114887426" Oct 13 19:56:26 crc 
kubenswrapper[4974]: I1013 19:56:26.648522 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9wdjx/crc-debug-x9fwf"] Oct 13 19:56:26 crc kubenswrapper[4974]: I1013 19:56:26.650313 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" Oct 13 19:56:26 crc kubenswrapper[4974]: I1013 19:56:26.732086 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12e6ddc2-c4cf-4450-b11b-00472af8dd81-host\") pod \"crc-debug-x9fwf\" (UID: \"12e6ddc2-c4cf-4450-b11b-00472af8dd81\") " pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" Oct 13 19:56:26 crc kubenswrapper[4974]: I1013 19:56:26.732196 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d28z\" (UniqueName: \"kubernetes.io/projected/12e6ddc2-c4cf-4450-b11b-00472af8dd81-kube-api-access-9d28z\") pod \"crc-debug-x9fwf\" (UID: \"12e6ddc2-c4cf-4450-b11b-00472af8dd81\") " pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" Oct 13 19:56:26 crc kubenswrapper[4974]: I1013 19:56:26.833938 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12e6ddc2-c4cf-4450-b11b-00472af8dd81-host\") pod \"crc-debug-x9fwf\" (UID: \"12e6ddc2-c4cf-4450-b11b-00472af8dd81\") " pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" Oct 13 19:56:26 crc kubenswrapper[4974]: I1013 19:56:26.834050 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d28z\" (UniqueName: \"kubernetes.io/projected/12e6ddc2-c4cf-4450-b11b-00472af8dd81-kube-api-access-9d28z\") pod \"crc-debug-x9fwf\" (UID: \"12e6ddc2-c4cf-4450-b11b-00472af8dd81\") " pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" Oct 13 19:56:26 crc kubenswrapper[4974]: I1013 19:56:26.834096 4974 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12e6ddc2-c4cf-4450-b11b-00472af8dd81-host\") pod \"crc-debug-x9fwf\" (UID: \"12e6ddc2-c4cf-4450-b11b-00472af8dd81\") " pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" Oct 13 19:56:26 crc kubenswrapper[4974]: I1013 19:56:26.852562 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d28z\" (UniqueName: \"kubernetes.io/projected/12e6ddc2-c4cf-4450-b11b-00472af8dd81-kube-api-access-9d28z\") pod \"crc-debug-x9fwf\" (UID: \"12e6ddc2-c4cf-4450-b11b-00472af8dd81\") " pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" Oct 13 19:56:26 crc kubenswrapper[4974]: I1013 19:56:26.968809 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" Oct 13 19:56:26 crc kubenswrapper[4974]: W1013 19:56:26.997476 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12e6ddc2_c4cf_4450_b11b_00472af8dd81.slice/crio-288c89a767c74a51b47620f193b01fb1a18d82af7f2a25d93dbddc83094e9175 WatchSource:0}: Error finding container 288c89a767c74a51b47620f193b01fb1a18d82af7f2a25d93dbddc83094e9175: Status 404 returned error can't find the container with id 288c89a767c74a51b47620f193b01fb1a18d82af7f2a25d93dbddc83094e9175 Oct 13 19:56:27 crc kubenswrapper[4974]: I1013 19:56:27.212543 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" event={"ID":"12e6ddc2-c4cf-4450-b11b-00472af8dd81","Type":"ContainerStarted","Data":"288c89a767c74a51b47620f193b01fb1a18d82af7f2a25d93dbddc83094e9175"} Oct 13 19:56:28 crc kubenswrapper[4974]: I1013 19:56:28.223601 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" event={"ID":"12e6ddc2-c4cf-4450-b11b-00472af8dd81","Type":"ContainerStarted","Data":"a468f0852551094ac26b775e3582efbc8a57446b0f244c7c1a04d5c4c9c813c0"} Oct 
13 19:56:28 crc kubenswrapper[4974]: I1013 19:56:28.242552 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" podStartSLOduration=2.242524265 podStartE2EDuration="2.242524265s" podCreationTimestamp="2025-10-13 19:56:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 19:56:28.235203299 +0000 UTC m=+6123.139569389" watchObservedRunningTime="2025-10-13 19:56:28.242524265 +0000 UTC m=+6123.146890385" Oct 13 19:56:34 crc kubenswrapper[4974]: I1013 19:56:34.811959 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:56:34 crc kubenswrapper[4974]: E1013 19:56:34.812873 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:56:47 crc kubenswrapper[4974]: I1013 19:56:47.815976 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:56:47 crc kubenswrapper[4974]: E1013 19:56:47.816642 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:56:59 crc kubenswrapper[4974]: I1013 19:56:59.812029 4974 scope.go:117] 
"RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:56:59 crc kubenswrapper[4974]: E1013 19:56:59.812771 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:57:07 crc kubenswrapper[4974]: I1013 19:57:07.614963 4974 generic.go:334] "Generic (PLEG): container finished" podID="12e6ddc2-c4cf-4450-b11b-00472af8dd81" containerID="a468f0852551094ac26b775e3582efbc8a57446b0f244c7c1a04d5c4c9c813c0" exitCode=0 Oct 13 19:57:07 crc kubenswrapper[4974]: I1013 19:57:07.615137 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" event={"ID":"12e6ddc2-c4cf-4450-b11b-00472af8dd81","Type":"ContainerDied","Data":"a468f0852551094ac26b775e3582efbc8a57446b0f244c7c1a04d5c4c9c813c0"} Oct 13 19:57:08 crc kubenswrapper[4974]: I1013 19:57:08.760107 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" Oct 13 19:57:08 crc kubenswrapper[4974]: I1013 19:57:08.789416 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9wdjx/crc-debug-x9fwf"] Oct 13 19:57:08 crc kubenswrapper[4974]: I1013 19:57:08.797754 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9wdjx/crc-debug-x9fwf"] Oct 13 19:57:08 crc kubenswrapper[4974]: I1013 19:57:08.826332 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d28z\" (UniqueName: \"kubernetes.io/projected/12e6ddc2-c4cf-4450-b11b-00472af8dd81-kube-api-access-9d28z\") pod \"12e6ddc2-c4cf-4450-b11b-00472af8dd81\" (UID: \"12e6ddc2-c4cf-4450-b11b-00472af8dd81\") " Oct 13 19:57:08 crc kubenswrapper[4974]: I1013 19:57:08.826431 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12e6ddc2-c4cf-4450-b11b-00472af8dd81-host\") pod \"12e6ddc2-c4cf-4450-b11b-00472af8dd81\" (UID: \"12e6ddc2-c4cf-4450-b11b-00472af8dd81\") " Oct 13 19:57:08 crc kubenswrapper[4974]: I1013 19:57:08.826914 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/12e6ddc2-c4cf-4450-b11b-00472af8dd81-host" (OuterVolumeSpecName: "host") pod "12e6ddc2-c4cf-4450-b11b-00472af8dd81" (UID: "12e6ddc2-c4cf-4450-b11b-00472af8dd81"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 19:57:08 crc kubenswrapper[4974]: I1013 19:57:08.831950 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e6ddc2-c4cf-4450-b11b-00472af8dd81-kube-api-access-9d28z" (OuterVolumeSpecName: "kube-api-access-9d28z") pod "12e6ddc2-c4cf-4450-b11b-00472af8dd81" (UID: "12e6ddc2-c4cf-4450-b11b-00472af8dd81"). InnerVolumeSpecName "kube-api-access-9d28z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:57:08 crc kubenswrapper[4974]: I1013 19:57:08.930709 4974 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/12e6ddc2-c4cf-4450-b11b-00472af8dd81-host\") on node \"crc\" DevicePath \"\"" Oct 13 19:57:08 crc kubenswrapper[4974]: I1013 19:57:08.931154 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d28z\" (UniqueName: \"kubernetes.io/projected/12e6ddc2-c4cf-4450-b11b-00472af8dd81-kube-api-access-9d28z\") on node \"crc\" DevicePath \"\"" Oct 13 19:57:09 crc kubenswrapper[4974]: I1013 19:57:09.639329 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="288c89a767c74a51b47620f193b01fb1a18d82af7f2a25d93dbddc83094e9175" Oct 13 19:57:09 crc kubenswrapper[4974]: I1013 19:57:09.639436 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-x9fwf" Oct 13 19:57:09 crc kubenswrapper[4974]: I1013 19:57:09.824264 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e6ddc2-c4cf-4450-b11b-00472af8dd81" path="/var/lib/kubelet/pods/12e6ddc2-c4cf-4450-b11b-00472af8dd81/volumes" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.088941 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9wdjx/crc-debug-2sqzm"] Oct 13 19:57:10 crc kubenswrapper[4974]: E1013 19:57:10.089841 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12e6ddc2-c4cf-4450-b11b-00472af8dd81" containerName="container-00" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.089870 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e6ddc2-c4cf-4450-b11b-00472af8dd81" containerName="container-00" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.097541 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e6ddc2-c4cf-4450-b11b-00472af8dd81" 
containerName="container-00" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.099691 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.255903 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d22a6dc-319e-4b16-a8c8-486731e75394-host\") pod \"crc-debug-2sqzm\" (UID: \"6d22a6dc-319e-4b16-a8c8-486731e75394\") " pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.256055 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qffsr\" (UniqueName: \"kubernetes.io/projected/6d22a6dc-319e-4b16-a8c8-486731e75394-kube-api-access-qffsr\") pod \"crc-debug-2sqzm\" (UID: \"6d22a6dc-319e-4b16-a8c8-486731e75394\") " pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.357919 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d22a6dc-319e-4b16-a8c8-486731e75394-host\") pod \"crc-debug-2sqzm\" (UID: \"6d22a6dc-319e-4b16-a8c8-486731e75394\") " pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.358037 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d22a6dc-319e-4b16-a8c8-486731e75394-host\") pod \"crc-debug-2sqzm\" (UID: \"6d22a6dc-319e-4b16-a8c8-486731e75394\") " pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.358037 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qffsr\" (UniqueName: \"kubernetes.io/projected/6d22a6dc-319e-4b16-a8c8-486731e75394-kube-api-access-qffsr\") 
pod \"crc-debug-2sqzm\" (UID: \"6d22a6dc-319e-4b16-a8c8-486731e75394\") " pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.376451 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qffsr\" (UniqueName: \"kubernetes.io/projected/6d22a6dc-319e-4b16-a8c8-486731e75394-kube-api-access-qffsr\") pod \"crc-debug-2sqzm\" (UID: \"6d22a6dc-319e-4b16-a8c8-486731e75394\") " pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.420875 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" Oct 13 19:57:10 crc kubenswrapper[4974]: W1013 19:57:10.451397 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6d22a6dc_319e_4b16_a8c8_486731e75394.slice/crio-0731194b56f79a252fec7296d44092a4b41de3846740e6705bc95c6e140766cf WatchSource:0}: Error finding container 0731194b56f79a252fec7296d44092a4b41de3846740e6705bc95c6e140766cf: Status 404 returned error can't find the container with id 0731194b56f79a252fec7296d44092a4b41de3846740e6705bc95c6e140766cf Oct 13 19:57:10 crc kubenswrapper[4974]: I1013 19:57:10.648953 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" event={"ID":"6d22a6dc-319e-4b16-a8c8-486731e75394","Type":"ContainerStarted","Data":"0731194b56f79a252fec7296d44092a4b41de3846740e6705bc95c6e140766cf"} Oct 13 19:57:11 crc kubenswrapper[4974]: I1013 19:57:11.659785 4974 generic.go:334] "Generic (PLEG): container finished" podID="6d22a6dc-319e-4b16-a8c8-486731e75394" containerID="0c42ee3ab294d92bc36f3f61176dc56791b918b3898a70fd2a7ed3131fa4868b" exitCode=0 Oct 13 19:57:11 crc kubenswrapper[4974]: I1013 19:57:11.659837 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" 
event={"ID":"6d22a6dc-319e-4b16-a8c8-486731e75394","Type":"ContainerDied","Data":"0c42ee3ab294d92bc36f3f61176dc56791b918b3898a70fd2a7ed3131fa4868b"} Oct 13 19:57:12 crc kubenswrapper[4974]: I1013 19:57:12.798166 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" Oct 13 19:57:12 crc kubenswrapper[4974]: I1013 19:57:12.903152 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qffsr\" (UniqueName: \"kubernetes.io/projected/6d22a6dc-319e-4b16-a8c8-486731e75394-kube-api-access-qffsr\") pod \"6d22a6dc-319e-4b16-a8c8-486731e75394\" (UID: \"6d22a6dc-319e-4b16-a8c8-486731e75394\") " Oct 13 19:57:12 crc kubenswrapper[4974]: I1013 19:57:12.903259 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d22a6dc-319e-4b16-a8c8-486731e75394-host\") pod \"6d22a6dc-319e-4b16-a8c8-486731e75394\" (UID: \"6d22a6dc-319e-4b16-a8c8-486731e75394\") " Oct 13 19:57:12 crc kubenswrapper[4974]: I1013 19:57:12.903340 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d22a6dc-319e-4b16-a8c8-486731e75394-host" (OuterVolumeSpecName: "host") pod "6d22a6dc-319e-4b16-a8c8-486731e75394" (UID: "6d22a6dc-319e-4b16-a8c8-486731e75394"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 19:57:12 crc kubenswrapper[4974]: I1013 19:57:12.904443 4974 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6d22a6dc-319e-4b16-a8c8-486731e75394-host\") on node \"crc\" DevicePath \"\"" Oct 13 19:57:12 crc kubenswrapper[4974]: I1013 19:57:12.921747 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d22a6dc-319e-4b16-a8c8-486731e75394-kube-api-access-qffsr" (OuterVolumeSpecName: "kube-api-access-qffsr") pod "6d22a6dc-319e-4b16-a8c8-486731e75394" (UID: "6d22a6dc-319e-4b16-a8c8-486731e75394"). InnerVolumeSpecName "kube-api-access-qffsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:57:13 crc kubenswrapper[4974]: I1013 19:57:13.006539 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qffsr\" (UniqueName: \"kubernetes.io/projected/6d22a6dc-319e-4b16-a8c8-486731e75394-kube-api-access-qffsr\") on node \"crc\" DevicePath \"\"" Oct 13 19:57:13 crc kubenswrapper[4974]: I1013 19:57:13.678807 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" event={"ID":"6d22a6dc-319e-4b16-a8c8-486731e75394","Type":"ContainerDied","Data":"0731194b56f79a252fec7296d44092a4b41de3846740e6705bc95c6e140766cf"} Oct 13 19:57:13 crc kubenswrapper[4974]: I1013 19:57:13.679079 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0731194b56f79a252fec7296d44092a4b41de3846740e6705bc95c6e140766cf" Oct 13 19:57:13 crc kubenswrapper[4974]: I1013 19:57:13.678884 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-2sqzm" Oct 13 19:57:13 crc kubenswrapper[4974]: I1013 19:57:13.811370 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:57:13 crc kubenswrapper[4974]: E1013 19:57:13.811621 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:57:13 crc kubenswrapper[4974]: I1013 19:57:13.876242 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9wdjx/crc-debug-2sqzm"] Oct 13 19:57:13 crc kubenswrapper[4974]: I1013 19:57:13.886510 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9wdjx/crc-debug-2sqzm"] Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.032718 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-9wdjx/crc-debug-ms5pv"] Oct 13 19:57:15 crc kubenswrapper[4974]: E1013 19:57:15.033393 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d22a6dc-319e-4b16-a8c8-486731e75394" containerName="container-00" Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.033405 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d22a6dc-319e-4b16-a8c8-486731e75394" containerName="container-00" Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.033638 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d22a6dc-319e-4b16-a8c8-486731e75394" containerName="container-00" Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.034528 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.152914 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c03b1ed6-355e-4301-8165-2224a91eb978-host\") pod \"crc-debug-ms5pv\" (UID: \"c03b1ed6-355e-4301-8165-2224a91eb978\") " pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.153068 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsrbn\" (UniqueName: \"kubernetes.io/projected/c03b1ed6-355e-4301-8165-2224a91eb978-kube-api-access-hsrbn\") pod \"crc-debug-ms5pv\" (UID: \"c03b1ed6-355e-4301-8165-2224a91eb978\") " pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.255092 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsrbn\" (UniqueName: \"kubernetes.io/projected/c03b1ed6-355e-4301-8165-2224a91eb978-kube-api-access-hsrbn\") pod \"crc-debug-ms5pv\" (UID: \"c03b1ed6-355e-4301-8165-2224a91eb978\") " pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.255284 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c03b1ed6-355e-4301-8165-2224a91eb978-host\") pod \"crc-debug-ms5pv\" (UID: \"c03b1ed6-355e-4301-8165-2224a91eb978\") " pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.255426 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c03b1ed6-355e-4301-8165-2224a91eb978-host\") pod \"crc-debug-ms5pv\" (UID: \"c03b1ed6-355e-4301-8165-2224a91eb978\") " pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" Oct 13 19:57:15 crc 
kubenswrapper[4974]: I1013 19:57:15.299695 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsrbn\" (UniqueName: \"kubernetes.io/projected/c03b1ed6-355e-4301-8165-2224a91eb978-kube-api-access-hsrbn\") pod \"crc-debug-ms5pv\" (UID: \"c03b1ed6-355e-4301-8165-2224a91eb978\") " pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.359244 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" Oct 13 19:57:15 crc kubenswrapper[4974]: W1013 19:57:15.420458 4974 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc03b1ed6_355e_4301_8165_2224a91eb978.slice/crio-4dfe4442fc55bf5008904627df6eae0ebc43d5f668af0e3282ddc3a20a0b4ee1 WatchSource:0}: Error finding container 4dfe4442fc55bf5008904627df6eae0ebc43d5f668af0e3282ddc3a20a0b4ee1: Status 404 returned error can't find the container with id 4dfe4442fc55bf5008904627df6eae0ebc43d5f668af0e3282ddc3a20a0b4ee1 Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.699685 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" event={"ID":"c03b1ed6-355e-4301-8165-2224a91eb978","Type":"ContainerStarted","Data":"75a4e09b24560b88017a1595b8d03e492f9425ed5c0c8b56f8fe32f3ef1b4102"} Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.699978 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" event={"ID":"c03b1ed6-355e-4301-8165-2224a91eb978","Type":"ContainerStarted","Data":"4dfe4442fc55bf5008904627df6eae0ebc43d5f668af0e3282ddc3a20a0b4ee1"} Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.721714 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" podStartSLOduration=0.721695339 podStartE2EDuration="721.695339ms" 
podCreationTimestamp="2025-10-13 19:57:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 19:57:15.718562281 +0000 UTC m=+6170.622928371" watchObservedRunningTime="2025-10-13 19:57:15.721695339 +0000 UTC m=+6170.626061419" Oct 13 19:57:15 crc kubenswrapper[4974]: I1013 19:57:15.834848 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d22a6dc-319e-4b16-a8c8-486731e75394" path="/var/lib/kubelet/pods/6d22a6dc-319e-4b16-a8c8-486731e75394/volumes" Oct 13 19:57:16 crc kubenswrapper[4974]: I1013 19:57:16.713610 4974 generic.go:334] "Generic (PLEG): container finished" podID="c03b1ed6-355e-4301-8165-2224a91eb978" containerID="75a4e09b24560b88017a1595b8d03e492f9425ed5c0c8b56f8fe32f3ef1b4102" exitCode=0 Oct 13 19:57:16 crc kubenswrapper[4974]: I1013 19:57:16.713686 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" event={"ID":"c03b1ed6-355e-4301-8165-2224a91eb978","Type":"ContainerDied","Data":"75a4e09b24560b88017a1595b8d03e492f9425ed5c0c8b56f8fe32f3ef1b4102"} Oct 13 19:57:17 crc kubenswrapper[4974]: I1013 19:57:17.858404 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" Oct 13 19:57:17 crc kubenswrapper[4974]: I1013 19:57:17.900201 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9wdjx/crc-debug-ms5pv"] Oct 13 19:57:17 crc kubenswrapper[4974]: I1013 19:57:17.910732 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9wdjx/crc-debug-ms5pv"] Oct 13 19:57:17 crc kubenswrapper[4974]: I1013 19:57:17.911147 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsrbn\" (UniqueName: \"kubernetes.io/projected/c03b1ed6-355e-4301-8165-2224a91eb978-kube-api-access-hsrbn\") pod \"c03b1ed6-355e-4301-8165-2224a91eb978\" (UID: \"c03b1ed6-355e-4301-8165-2224a91eb978\") " Oct 13 19:57:17 crc kubenswrapper[4974]: I1013 19:57:17.911375 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c03b1ed6-355e-4301-8165-2224a91eb978-host\") pod \"c03b1ed6-355e-4301-8165-2224a91eb978\" (UID: \"c03b1ed6-355e-4301-8165-2224a91eb978\") " Oct 13 19:57:17 crc kubenswrapper[4974]: I1013 19:57:17.913028 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c03b1ed6-355e-4301-8165-2224a91eb978-host" (OuterVolumeSpecName: "host") pod "c03b1ed6-355e-4301-8165-2224a91eb978" (UID: "c03b1ed6-355e-4301-8165-2224a91eb978"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Oct 13 19:57:17 crc kubenswrapper[4974]: I1013 19:57:17.923105 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03b1ed6-355e-4301-8165-2224a91eb978-kube-api-access-hsrbn" (OuterVolumeSpecName: "kube-api-access-hsrbn") pod "c03b1ed6-355e-4301-8165-2224a91eb978" (UID: "c03b1ed6-355e-4301-8165-2224a91eb978"). InnerVolumeSpecName "kube-api-access-hsrbn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:57:18 crc kubenswrapper[4974]: I1013 19:57:18.013749 4974 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c03b1ed6-355e-4301-8165-2224a91eb978-host\") on node \"crc\" DevicePath \"\"" Oct 13 19:57:18 crc kubenswrapper[4974]: I1013 19:57:18.014074 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsrbn\" (UniqueName: \"kubernetes.io/projected/c03b1ed6-355e-4301-8165-2224a91eb978-kube-api-access-hsrbn\") on node \"crc\" DevicePath \"\"" Oct 13 19:57:18 crc kubenswrapper[4974]: I1013 19:57:18.739263 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4dfe4442fc55bf5008904627df6eae0ebc43d5f668af0e3282ddc3a20a0b4ee1" Oct 13 19:57:18 crc kubenswrapper[4974]: I1013 19:57:18.739340 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/crc-debug-ms5pv" Oct 13 19:57:19 crc kubenswrapper[4974]: I1013 19:57:19.822774 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03b1ed6-355e-4301-8165-2224a91eb978" path="/var/lib/kubelet/pods/c03b1ed6-355e-4301-8165-2224a91eb978/volumes" Oct 13 19:57:27 crc kubenswrapper[4974]: I1013 19:57:27.812251 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:57:27 crc kubenswrapper[4974]: E1013 19:57:27.813070 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:57:42 crc kubenswrapper[4974]: I1013 19:57:42.812456 4974 
scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:57:42 crc kubenswrapper[4974]: E1013 19:57:42.813198 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:57:43 crc kubenswrapper[4974]: I1013 19:57:43.514303 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-cf595f5c8-dtfck_b67ca997-2edf-492b-ab80-f618c7201a29/barbican-api/0.log" Oct 13 19:57:43 crc kubenswrapper[4974]: I1013 19:57:43.909014 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f69856764-9cjzr_45c18dbd-7083-463d-b845-f213bf6ae1ce/barbican-keystone-listener/0.log" Oct 13 19:57:43 crc kubenswrapper[4974]: I1013 19:57:43.951546 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-cf595f5c8-dtfck_b67ca997-2edf-492b-ab80-f618c7201a29/barbican-api-log/0.log" Oct 13 19:57:43 crc kubenswrapper[4974]: I1013 19:57:43.993669 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6f69856764-9cjzr_45c18dbd-7083-463d-b845-f213bf6ae1ce/barbican-keystone-listener-log/0.log" Oct 13 19:57:44 crc kubenswrapper[4974]: I1013 19:57:44.154506 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5b867446cf-7crxm_5acb3840-d265-46e7-8a2b-630f1bf38ec5/barbican-worker/0.log" Oct 13 19:57:44 crc kubenswrapper[4974]: I1013 19:57:44.168621 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-5b867446cf-7crxm_5acb3840-d265-46e7-8a2b-630f1bf38ec5/barbican-worker-log/0.log" Oct 13 19:57:44 crc kubenswrapper[4974]: I1013 19:57:44.358138 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-89bkg_684a8cdf-df17-41a7-87b8-9027cb982025/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:44 crc kubenswrapper[4974]: I1013 19:57:44.508547 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cc0eb77b-ce25-42b6-a03f-600b090be522/ceilometer-notification-agent/0.log" Oct 13 19:57:44 crc kubenswrapper[4974]: I1013 19:57:44.528951 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cc0eb77b-ce25-42b6-a03f-600b090be522/ceilometer-central-agent/0.log" Oct 13 19:57:44 crc kubenswrapper[4974]: I1013 19:57:44.590790 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cc0eb77b-ce25-42b6-a03f-600b090be522/sg-core/0.log" Oct 13 19:57:44 crc kubenswrapper[4974]: I1013 19:57:44.633841 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_cc0eb77b-ce25-42b6-a03f-600b090be522/proxy-httpd/0.log" Oct 13 19:57:44 crc kubenswrapper[4974]: I1013 19:57:44.869175 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a4d89238-6f74-4da7-aa6d-1b6c5f56a204/cinder-api-log/0.log" Oct 13 19:57:45 crc kubenswrapper[4974]: I1013 19:57:45.145837 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_57963e15-5fff-4158-ad83-0e4bd2ca1f7f/probe/0.log" Oct 13 19:57:45 crc kubenswrapper[4974]: I1013 19:57:45.451371 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_8a1a4b28-15ae-4fa5-8741-2d34b1062eee/probe/0.log" Oct 13 19:57:45 crc kubenswrapper[4974]: I1013 19:57:45.459784 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-scheduler-0_8a1a4b28-15ae-4fa5-8741-2d34b1062eee/cinder-scheduler/0.log" Oct 13 19:57:45 crc kubenswrapper[4974]: I1013 19:57:45.914733 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e/probe/0.log" Oct 13 19:57:45 crc kubenswrapper[4974]: I1013 19:57:45.954166 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_57963e15-5fff-4158-ad83-0e4bd2ca1f7f/cinder-backup/0.log" Oct 13 19:57:46 crc kubenswrapper[4974]: I1013 19:57:46.114606 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a4d89238-6f74-4da7-aa6d-1b6c5f56a204/cinder-api/0.log" Oct 13 19:57:46 crc kubenswrapper[4974]: I1013 19:57:46.309189 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-0_90cb7cf6-4e7a-4583-a5d5-4ce3d0a3c16e/cinder-volume/0.log" Oct 13 19:57:46 crc kubenswrapper[4974]: I1013 19:57:46.394459 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_688ac6ef-82eb-421b-9949-a832b8a73319/probe/0.log" Oct 13 19:57:46 crc kubenswrapper[4974]: I1013 19:57:46.567389 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-9bgh9_fedc6dd9-1f4c-43f4-9e0b-74292be529a6/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:46 crc kubenswrapper[4974]: I1013 19:57:46.638166 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-nfs-2-0_688ac6ef-82eb-421b-9949-a832b8a73319/cinder-volume/0.log" Oct 13 19:57:46 crc kubenswrapper[4974]: I1013 19:57:46.699148 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-4vs4v_177f015a-482d-4058-a475-e6f787c7c1e5/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:46 crc kubenswrapper[4974]: 
I1013 19:57:46.792879 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-5dmzh_af61911c-89bc-4e8c-a327-6c1bab3c7d5d/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:46 crc kubenswrapper[4974]: I1013 19:57:46.864816 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7d59b7cdcf-mbsgm_83dacb6d-48a4-400a-9edb-74a61b3bf83f/init/0.log" Oct 13 19:57:47 crc kubenswrapper[4974]: I1013 19:57:47.012558 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7d59b7cdcf-mbsgm_83dacb6d-48a4-400a-9edb-74a61b3bf83f/init/0.log" Oct 13 19:57:47 crc kubenswrapper[4974]: I1013 19:57:47.129428 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-8lbms_dec36c0a-5335-4f2c-9582-ccd2c8f30207/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:47 crc kubenswrapper[4974]: I1013 19:57:47.197047 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-7d59b7cdcf-mbsgm_83dacb6d-48a4-400a-9edb-74a61b3bf83f/dnsmasq-dns/0.log" Oct 13 19:57:47 crc kubenswrapper[4974]: I1013 19:57:47.261626 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c989ea63-17f9-4aca-a407-9e07cbb1a04c/glance-httpd/0.log" Oct 13 19:57:47 crc kubenswrapper[4974]: I1013 19:57:47.306810 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_c989ea63-17f9-4aca-a407-9e07cbb1a04c/glance-log/0.log" Oct 13 19:57:47 crc kubenswrapper[4974]: I1013 19:57:47.401221 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7f9b8239-9b8d-4e59-8ba6-b7d8b5959248/glance-log/0.log" Oct 13 19:57:47 crc kubenswrapper[4974]: I1013 19:57:47.402016 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-internal-api-0_7f9b8239-9b8d-4e59-8ba6-b7d8b5959248/glance-httpd/0.log" Oct 13 19:57:47 crc kubenswrapper[4974]: I1013 19:57:47.548367 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dcbf4cfcd-l89jc_003d2222-76eb-4a8c-b7c2-f201e88c542d/horizon/0.log" Oct 13 19:57:47 crc kubenswrapper[4974]: I1013 19:57:47.616134 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-wpc86_d272e4d0-84bf-4909-af41-81fe1f14bfcb/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:47 crc kubenswrapper[4974]: I1013 19:57:47.785946 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mwhx7_7e2087d1-027f-4fc7-8a75-5421f0e55868/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:48 crc kubenswrapper[4974]: I1013 19:57:48.029605 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29339701-wr52s_d52747d1-422c-40d4-ae78-d45dafcf9cbf/keystone-cron/0.log" Oct 13 19:57:48 crc kubenswrapper[4974]: I1013 19:57:48.493523 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_8609b0d5-280b-498a-88da-2de3c7e27605/kube-state-metrics/0.log" Oct 13 19:57:48 crc kubenswrapper[4974]: I1013 19:57:48.544010 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6dcbf4cfcd-l89jc_003d2222-76eb-4a8c-b7c2-f201e88c542d/horizon-log/0.log" Oct 13 19:57:48 crc kubenswrapper[4974]: I1013 19:57:48.605493 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-6f98cf4cc8-pgzsc_63d27b98-4a56-4fbc-a0c6-dd31bd3dfc52/keystone-api/0.log" Oct 13 19:57:48 crc kubenswrapper[4974]: I1013 19:57:48.654809 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-bpglt_dc0e7077-837e-4e51-a095-60eed2b94a51/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:49 crc kubenswrapper[4974]: I1013 19:57:49.101612 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-n5r72_4d0a5311-b8dc-4ba7-bf71-d43a4a0454ee/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:49 crc kubenswrapper[4974]: I1013 19:57:49.198883 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8475fc656f-dnpll_59a33676-139f-4010-ab9a-25832163ab83/neutron-api/0.log" Oct 13 19:57:49 crc kubenswrapper[4974]: I1013 19:57:49.218174 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8475fc656f-dnpll_59a33676-139f-4010-ab9a-25832163ab83/neutron-httpd/0.log" Oct 13 19:57:49 crc kubenswrapper[4974]: I1013 19:57:49.849060 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_be594e70-2775-4c06-a266-b2fcaf428134/nova-cell0-conductor-conductor/0.log" Oct 13 19:57:50 crc kubenswrapper[4974]: I1013 19:57:50.112854 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_004c8db1-b15c-43d1-b988-92d779aaebb2/nova-cell1-conductor-conductor/0.log" Oct 13 19:57:50 crc kubenswrapper[4974]: I1013 19:57:50.480316 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_86eb0f8b-123d-4c2a-b7c6-d0a613625ee8/nova-cell1-novncproxy-novncproxy/0.log" Oct 13 19:57:50 crc kubenswrapper[4974]: I1013 19:57:50.680235 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-mj759_533bff4f-cd80-4893-95e8-404276a2e0d0/nova-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:50 crc kubenswrapper[4974]: I1013 19:57:50.959480 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_420e7e88-d552-4dd6-b5f8-b8ec9d8b9354/nova-api-log/0.log" Oct 13 19:57:51 crc kubenswrapper[4974]: I1013 19:57:51.002449 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_46af3042-50b4-462e-9449-4d521fd32afa/nova-metadata-log/0.log" Oct 13 19:57:51 crc kubenswrapper[4974]: I1013 19:57:51.499026 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_420e7e88-d552-4dd6-b5f8-b8ec9d8b9354/nova-api-api/0.log" Oct 13 19:57:51 crc kubenswrapper[4974]: I1013 19:57:51.599336 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0c7ef8b9-b24d-4ddf-b764-41cbd10095e8/mysql-bootstrap/0.log" Oct 13 19:57:51 crc kubenswrapper[4974]: I1013 19:57:51.622695 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_a5915d47-3416-4678-8589-31ea94154b54/nova-scheduler-scheduler/0.log" Oct 13 19:57:51 crc kubenswrapper[4974]: I1013 19:57:51.764366 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0c7ef8b9-b24d-4ddf-b764-41cbd10095e8/mysql-bootstrap/0.log" Oct 13 19:57:51 crc kubenswrapper[4974]: I1013 19:57:51.926621 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_0c7ef8b9-b24d-4ddf-b764-41cbd10095e8/galera/0.log" Oct 13 19:57:52 crc kubenswrapper[4974]: I1013 19:57:52.083610 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_72dd5704-4623-4186-8914-512c4ea61a5b/mysql-bootstrap/0.log" Oct 13 19:57:52 crc kubenswrapper[4974]: I1013 19:57:52.233726 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_72dd5704-4623-4186-8914-512c4ea61a5b/mysql-bootstrap/0.log" Oct 13 19:57:52 crc kubenswrapper[4974]: I1013 19:57:52.366796 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_72dd5704-4623-4186-8914-512c4ea61a5b/galera/0.log" Oct 13 19:57:52 crc kubenswrapper[4974]: I1013 19:57:52.487312 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_e9e672e2-a15c-4cfa-b751-c6208182f2c7/openstackclient/0.log" Oct 13 19:57:52 crc kubenswrapper[4974]: I1013 19:57:52.872649 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-tvbtp_2d4633f8-1f68-4d44-8569-69f02e1886f3/openstack-network-exporter/0.log" Oct 13 19:57:53 crc kubenswrapper[4974]: I1013 19:57:53.022891 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cxs58_d24d9e9c-90a1-490b-80d9-4d36d6050083/ovsdb-server-init/0.log" Oct 13 19:57:53 crc kubenswrapper[4974]: I1013 19:57:53.204087 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cxs58_d24d9e9c-90a1-490b-80d9-4d36d6050083/ovsdb-server-init/0.log" Oct 13 19:57:53 crc kubenswrapper[4974]: I1013 19:57:53.271745 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cxs58_d24d9e9c-90a1-490b-80d9-4d36d6050083/ovsdb-server/0.log" Oct 13 19:57:53 crc kubenswrapper[4974]: I1013 19:57:53.533185 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-r5pdv_c233290a-abb8-4429-8500-f4ec541ccc21/ovn-controller/0.log" Oct 13 19:57:53 crc kubenswrapper[4974]: I1013 19:57:53.577588 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_46af3042-50b4-462e-9449-4d521fd32afa/nova-metadata-metadata/0.log" Oct 13 19:57:53 crc kubenswrapper[4974]: I1013 19:57:53.644958 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cxs58_d24d9e9c-90a1-490b-80d9-4d36d6050083/ovs-vswitchd/0.log" Oct 13 19:57:53 crc kubenswrapper[4974]: I1013 19:57:53.773464 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-5wshh_a0a9ad72-5d41-4b79-8d65-797ed063b530/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:53 crc kubenswrapper[4974]: I1013 19:57:53.822992 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_97fd37f3-ee6b-4a37-a32b-057d21edf416/openstack-network-exporter/0.log" Oct 13 19:57:53 crc kubenswrapper[4974]: I1013 19:57:53.897614 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_97fd37f3-ee6b-4a37-a32b-057d21edf416/ovn-northd/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.068531 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_def34e48-c96a-4074-8780-44ba062e6816/memcached/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.115081 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eede053f-a3cf-4af6-9f1a-7458bec6f5a3/ovsdbserver-nb/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.116104 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_eede053f-a3cf-4af6-9f1a-7458bec6f5a3/openstack-network-exporter/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.261139 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_63c7b046-cd5e-42e0-b295-ca90bb6a53c9/openstack-network-exporter/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.323229 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_63c7b046-cd5e-42e0-b295-ca90bb6a53c9/ovsdbserver-sb/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.496169 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6b66599996-gvfwf_c6c335f4-044a-4970-8a80-05755d65b00a/placement-api/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.579467 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_prometheus-metric-storage-0_24cb3219-26af-43ab-95da-2320e69129db/init-config-reloader/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.641977 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6b66599996-gvfwf_c6c335f4-044a-4970-8a80-05755d65b00a/placement-log/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.755823 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_24cb3219-26af-43ab-95da-2320e69129db/init-config-reloader/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.776671 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_24cb3219-26af-43ab-95da-2320e69129db/config-reloader/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.776866 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_24cb3219-26af-43ab-95da-2320e69129db/prometheus/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.824215 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_24cb3219-26af-43ab-95da-2320e69129db/thanos-sidecar/0.log" Oct 13 19:57:54 crc kubenswrapper[4974]: I1013 19:57:54.945583 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4bf0b2fe-061e-486f-9e0f-96bd13bc7eae/setup-container/0.log" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.095067 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4bf0b2fe-061e-486f-9e0f-96bd13bc7eae/setup-container/0.log" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.121216 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_a3edaa1a-d213-473f-963a-3bfea41226ec/setup-container/0.log" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.135995 4974 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4bf0b2fe-061e-486f-9e0f-96bd13bc7eae/rabbitmq/0.log" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.369021 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_a3edaa1a-d213-473f-963a-3bfea41226ec/setup-container/0.log" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.377470 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4b2e987b-fd90-420d-86f1-b9757dd40b03/setup-container/0.log" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.378335 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5xkjq"] Oct 13 19:57:55 crc kubenswrapper[4974]: E1013 19:57:55.378736 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c03b1ed6-355e-4301-8165-2224a91eb978" containerName="container-00" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.378752 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="c03b1ed6-355e-4301-8165-2224a91eb978" containerName="container-00" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.378991 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="c03b1ed6-355e-4301-8165-2224a91eb978" containerName="container-00" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.380399 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.390217 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5xkjq"] Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.447367 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_a3edaa1a-d213-473f-963a-3bfea41226ec/rabbitmq/0.log" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.492786 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dt62g\" (UniqueName: \"kubernetes.io/projected/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-kube-api-access-dt62g\") pod \"community-operators-5xkjq\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") " pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.492881 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-catalog-content\") pod \"community-operators-5xkjq\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") " pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.492947 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-utilities\") pod \"community-operators-5xkjq\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") " pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.594815 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-catalog-content\") pod 
\"community-operators-5xkjq\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") " pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.594895 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-utilities\") pod \"community-operators-5xkjq\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") " pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.594984 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dt62g\" (UniqueName: \"kubernetes.io/projected/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-kube-api-access-dt62g\") pod \"community-operators-5xkjq\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") " pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.595386 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-catalog-content\") pod \"community-operators-5xkjq\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") " pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.595478 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-utilities\") pod \"community-operators-5xkjq\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") " pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.626835 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dt62g\" (UniqueName: \"kubernetes.io/projected/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-kube-api-access-dt62g\") pod \"community-operators-5xkjq\" (UID: 
\"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") " pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.673948 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-g4h5r_fc6631d2-7807-4296-8806-da8155c0992e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.704103 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5xkjq" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.795454 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4b2e987b-fd90-420d-86f1-b9757dd40b03/rabbitmq/0.log" Oct 13 19:57:55 crc kubenswrapper[4974]: I1013 19:57:55.916883 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_4b2e987b-fd90-420d-86f1-b9757dd40b03/setup-container/0.log" Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.103871 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-fg6xw_4f6b26b6-df93-48fd-bbec-18aa5a371db8/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.211282 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-cmccw_127007fe-96b5-4741-b207-af9ec05b68da/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.260759 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-b9jfx_3d236165-9044-430d-92cf-33e4eadd281f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.292495 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5xkjq"] Oct 13 
19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.461139 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-k7bxc_37c5496c-447a-4806-81b1-15f11c4d057e/ssh-known-hosts-edpm-deployment/0.log"
Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.655782 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b5b857db9-xmtpd_835d1e2d-e4b2-47d5-89a2-ef955e650cc1/proxy-server/0.log"
Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.728792 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b5b857db9-xmtpd_835d1e2d-e4b2-47d5-89a2-ef955e650cc1/proxy-httpd/0.log"
Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.812829 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40"
Oct 13 19:57:56 crc kubenswrapper[4974]: E1013 19:57:56.813079 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1"
Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.855488 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-7phmq_6aff3b1c-df57-4faf-9c6b-1009d5090a13/swift-ring-rebalance/0.log"
Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.965474 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/account-auditor/0.log"
Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.986267 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/account-reaper/0.log"
Oct 13 19:57:56 crc kubenswrapper[4974]: I1013 19:57:56.992405 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/account-replicator/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.069791 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/account-server/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.132873 4974 generic.go:334] "Generic (PLEG): container finished" podID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerID="cbe3e5b981a2a9129cae3a5f88d3f2da8e60584a1dee995b104ce3e5ca06190c" exitCode=0
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.132924 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xkjq" event={"ID":"fa90ce56-f0c9-4861-96fb-380e4fd8ed50","Type":"ContainerDied","Data":"cbe3e5b981a2a9129cae3a5f88d3f2da8e60584a1dee995b104ce3e5ca06190c"}
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.132968 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xkjq" event={"ID":"fa90ce56-f0c9-4861-96fb-380e4fd8ed50","Type":"ContainerStarted","Data":"def170c7254e0e4f5744bd49d3b2a9f85c9cddc6d789d6f376413c9983270c0c"}
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.175569 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/container-auditor/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.257965 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/container-server/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.258735 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/container-updater/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.293552 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/object-auditor/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.307509 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/container-replicator/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.377169 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/object-expirer/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.498611 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/object-replicator/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.502279 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/object-updater/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.503025 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/object-server/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.518381 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/rsync/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.600934 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_d7e32e29-6d51-4230-b7d5-911b0787a900/swift-recon-cron/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.780637 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_98e331dd-24d4-4707-b432-557ea90e6048/tempest-tests-tempest-tests-runner/0.log"
Oct 13 19:57:57 crc kubenswrapper[4974]: I1013 19:57:57.793401 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-rdhwr_a947ab95-2720-4cda-a618-470943b7443c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 13 19:57:58 crc kubenswrapper[4974]: I1013 19:57:58.047013 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e6f8e8b4-a194-4de6-b1c0-9b5b183136c5/test-operator-logs-container/0.log"
Oct 13 19:57:58 crc kubenswrapper[4974]: I1013 19:57:58.089453 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-z8f4t_f2eee5ad-fe26-46b2-af3c-1477c1513609/validate-network-edpm-deployment-openstack-edpm-ipam/0.log"
Oct 13 19:57:58 crc kubenswrapper[4974]: I1013 19:57:58.143101 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xkjq" event={"ID":"fa90ce56-f0c9-4861-96fb-380e4fd8ed50","Type":"ContainerStarted","Data":"7b2d2a9abce25802bbd52a9b743138494fec14db5870494fb29c5dc74fd1e259"}
Oct 13 19:57:58 crc kubenswrapper[4974]: I1013 19:57:58.833925 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_dc2497ce-b7ed-481e-88c0-eb2e7aef34f9/watcher-applier/0.log"
Oct 13 19:57:59 crc kubenswrapper[4974]: I1013 19:57:59.162086 4974 generic.go:334] "Generic (PLEG): container finished" podID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerID="7b2d2a9abce25802bbd52a9b743138494fec14db5870494fb29c5dc74fd1e259" exitCode=0
Oct 13 19:57:59 crc kubenswrapper[4974]: I1013 19:57:59.162127 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xkjq" event={"ID":"fa90ce56-f0c9-4861-96fb-380e4fd8ed50","Type":"ContainerDied","Data":"7b2d2a9abce25802bbd52a9b743138494fec14db5870494fb29c5dc74fd1e259"}
Oct 13 19:57:59 crc kubenswrapper[4974]: I1013 19:57:59.361738 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5c387632-008a-4609-8b64-ff84c35596c7/watcher-api-log/0.log"
Oct 13 19:58:00 crc kubenswrapper[4974]: I1013 19:58:00.176866 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xkjq" event={"ID":"fa90ce56-f0c9-4861-96fb-380e4fd8ed50","Type":"ContainerStarted","Data":"1407e6a9918df138b389f1540f10983eddb65c4a344d1842cb23128e5bd88826"}
Oct 13 19:58:00 crc kubenswrapper[4974]: I1013 19:58:00.241924 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5xkjq" podStartSLOduration=2.760400503 podStartE2EDuration="5.241903086s" podCreationTimestamp="2025-10-13 19:57:55 +0000 UTC" firstStartedPulling="2025-10-13 19:57:57.134302826 +0000 UTC m=+6212.038668906" lastFinishedPulling="2025-10-13 19:57:59.615805409 +0000 UTC m=+6214.520171489" observedRunningTime="2025-10-13 19:58:00.222118049 +0000 UTC m=+6215.126484129" watchObservedRunningTime="2025-10-13 19:58:00.241903086 +0000 UTC m=+6215.146269176"
Oct 13 19:58:01 crc kubenswrapper[4974]: I1013 19:58:01.817844 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_b6ea71d0-a795-4a73-9108-dc8e4a3e4187/watcher-decision-engine/0.log"
Oct 13 19:58:02 crc kubenswrapper[4974]: I1013 19:58:02.621975 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_5c387632-008a-4609-8b64-ff84c35596c7/watcher-api/0.log"
Oct 13 19:58:05 crc kubenswrapper[4974]: I1013 19:58:05.704701 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5xkjq"
Oct 13 19:58:05 crc kubenswrapper[4974]: I1013 19:58:05.706160 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5xkjq"
Oct 13 19:58:05 crc kubenswrapper[4974]: I1013 19:58:05.762556 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5xkjq"
Oct 13 19:58:06 crc kubenswrapper[4974]: I1013 19:58:06.322984 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5xkjq"
Oct 13 19:58:06 crc kubenswrapper[4974]: I1013 19:58:06.380878 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5xkjq"]
Oct 13 19:58:08 crc kubenswrapper[4974]: I1013 19:58:08.264299 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5xkjq" podUID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerName="registry-server" containerID="cri-o://1407e6a9918df138b389f1540f10983eddb65c4a344d1842cb23128e5bd88826" gracePeriod=2
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.276477 4974 generic.go:334] "Generic (PLEG): container finished" podID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerID="1407e6a9918df138b389f1540f10983eddb65c4a344d1842cb23128e5bd88826" exitCode=0
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.276560 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xkjq" event={"ID":"fa90ce56-f0c9-4861-96fb-380e4fd8ed50","Type":"ContainerDied","Data":"1407e6a9918df138b389f1540f10983eddb65c4a344d1842cb23128e5bd88826"}
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.276811 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5xkjq" event={"ID":"fa90ce56-f0c9-4861-96fb-380e4fd8ed50","Type":"ContainerDied","Data":"def170c7254e0e4f5744bd49d3b2a9f85c9cddc6d789d6f376413c9983270c0c"}
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.276833 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="def170c7254e0e4f5744bd49d3b2a9f85c9cddc6d789d6f376413c9983270c0c"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.285514 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5xkjq"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.390383 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-utilities\") pod \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") "
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.390515 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dt62g\" (UniqueName: \"kubernetes.io/projected/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-kube-api-access-dt62g\") pod \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") "
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.390587 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-catalog-content\") pod \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\" (UID: \"fa90ce56-f0c9-4861-96fb-380e4fd8ed50\") "
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.391286 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-utilities" (OuterVolumeSpecName: "utilities") pod "fa90ce56-f0c9-4861-96fb-380e4fd8ed50" (UID: "fa90ce56-f0c9-4861-96fb-380e4fd8ed50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.396509 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-kube-api-access-dt62g" (OuterVolumeSpecName: "kube-api-access-dt62g") pod "fa90ce56-f0c9-4861-96fb-380e4fd8ed50" (UID: "fa90ce56-f0c9-4861-96fb-380e4fd8ed50"). InnerVolumeSpecName "kube-api-access-dt62g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.450065 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa90ce56-f0c9-4861-96fb-380e4fd8ed50" (UID: "fa90ce56-f0c9-4861-96fb-380e4fd8ed50"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.493155 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dt62g\" (UniqueName: \"kubernetes.io/projected/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-kube-api-access-dt62g\") on node \"crc\" DevicePath \"\""
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.493188 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.493198 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa90ce56-f0c9-4861-96fb-380e4fd8ed50-utilities\") on node \"crc\" DevicePath \"\""
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.606496 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8vll5"]
Oct 13 19:58:09 crc kubenswrapper[4974]: E1013 19:58:09.606959 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerName="extract-utilities"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.606975 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerName="extract-utilities"
Oct 13 19:58:09 crc kubenswrapper[4974]: E1013 19:58:09.606986 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerName="registry-server"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.606995 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerName="registry-server"
Oct 13 19:58:09 crc kubenswrapper[4974]: E1013 19:58:09.607007 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerName="extract-content"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.607014 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerName="extract-content"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.607211 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" containerName="registry-server"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.608617 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.640522 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8vll5"]
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.697550 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-utilities\") pod \"certified-operators-8vll5\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") " pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.697620 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kmnz\" (UniqueName: \"kubernetes.io/projected/b1a5dacf-0328-45cb-a9e4-98698a097c89-kube-api-access-5kmnz\") pod \"certified-operators-8vll5\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") " pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.697673 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-catalog-content\") pod \"certified-operators-8vll5\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") " pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.799235 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kmnz\" (UniqueName: \"kubernetes.io/projected/b1a5dacf-0328-45cb-a9e4-98698a097c89-kube-api-access-5kmnz\") pod \"certified-operators-8vll5\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") " pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.799281 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-catalog-content\") pod \"certified-operators-8vll5\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") " pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.799444 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-utilities\") pod \"certified-operators-8vll5\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") " pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.799759 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-catalog-content\") pod \"certified-operators-8vll5\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") " pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.799794 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-utilities\") pod \"certified-operators-8vll5\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") " pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.827508 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kmnz\" (UniqueName: \"kubernetes.io/projected/b1a5dacf-0328-45cb-a9e4-98698a097c89-kube-api-access-5kmnz\") pod \"certified-operators-8vll5\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") " pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:09 crc kubenswrapper[4974]: I1013 19:58:09.936159 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:10 crc kubenswrapper[4974]: I1013 19:58:10.288429 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5xkjq"
Oct 13 19:58:10 crc kubenswrapper[4974]: I1013 19:58:10.317040 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5xkjq"]
Oct 13 19:58:10 crc kubenswrapper[4974]: I1013 19:58:10.325783 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5xkjq"]
Oct 13 19:58:10 crc kubenswrapper[4974]: I1013 19:58:10.460914 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8vll5"]
Oct 13 19:58:10 crc kubenswrapper[4974]: I1013 19:58:10.811846 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40"
Oct 13 19:58:10 crc kubenswrapper[4974]: E1013 19:58:10.812509 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1"
Oct 13 19:58:11 crc kubenswrapper[4974]: I1013 19:58:11.297871 4974 generic.go:334] "Generic (PLEG): container finished" podID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerID="33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86" exitCode=0
Oct 13 19:58:11 crc kubenswrapper[4974]: I1013 19:58:11.297902 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vll5" event={"ID":"b1a5dacf-0328-45cb-a9e4-98698a097c89","Type":"ContainerDied","Data":"33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86"}
Oct 13 19:58:11 crc kubenswrapper[4974]: I1013 19:58:11.298544 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vll5" event={"ID":"b1a5dacf-0328-45cb-a9e4-98698a097c89","Type":"ContainerStarted","Data":"1e4b03956cdcab8a672f69293bdcfd0887d0383d45977b02975d6a0f2e859bad"}
Oct 13 19:58:11 crc kubenswrapper[4974]: I1013 19:58:11.300352 4974 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Oct 13 19:58:11 crc kubenswrapper[4974]: I1013 19:58:11.834614 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa90ce56-f0c9-4861-96fb-380e4fd8ed50" path="/var/lib/kubelet/pods/fa90ce56-f0c9-4861-96fb-380e4fd8ed50/volumes"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.003549 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2ccrt"]
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.005867 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.021224 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2ccrt"]
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.054306 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-utilities\") pod \"redhat-marketplace-2ccrt\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.054588 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-catalog-content\") pod \"redhat-marketplace-2ccrt\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.054884 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s87m2\" (UniqueName: \"kubernetes.io/projected/377bd5f5-b4ae-43ea-b675-9c88662194b4-kube-api-access-s87m2\") pod \"redhat-marketplace-2ccrt\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.156188 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-utilities\") pod \"redhat-marketplace-2ccrt\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.156287 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-catalog-content\") pod \"redhat-marketplace-2ccrt\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.156354 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s87m2\" (UniqueName: \"kubernetes.io/projected/377bd5f5-b4ae-43ea-b675-9c88662194b4-kube-api-access-s87m2\") pod \"redhat-marketplace-2ccrt\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.157031 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-utilities\") pod \"redhat-marketplace-2ccrt\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.157233 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-catalog-content\") pod \"redhat-marketplace-2ccrt\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.176549 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s87m2\" (UniqueName: \"kubernetes.io/projected/377bd5f5-b4ae-43ea-b675-9c88662194b4-kube-api-access-s87m2\") pod \"redhat-marketplace-2ccrt\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.311158 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vll5" event={"ID":"b1a5dacf-0328-45cb-a9e4-98698a097c89","Type":"ContainerStarted","Data":"ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee"}
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.338306 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:12 crc kubenswrapper[4974]: I1013 19:58:12.812925 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2ccrt"]
Oct 13 19:58:13 crc kubenswrapper[4974]: I1013 19:58:13.321858 4974 generic.go:334] "Generic (PLEG): container finished" podID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerID="732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632" exitCode=0
Oct 13 19:58:13 crc kubenswrapper[4974]: I1013 19:58:13.321944 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2ccrt" event={"ID":"377bd5f5-b4ae-43ea-b675-9c88662194b4","Type":"ContainerDied","Data":"732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632"}
Oct 13 19:58:13 crc kubenswrapper[4974]: I1013 19:58:13.322181 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2ccrt" event={"ID":"377bd5f5-b4ae-43ea-b675-9c88662194b4","Type":"ContainerStarted","Data":"29947529519b6486f122bb46b86f3014df2ee02cbaef7d98c3123dae99e2e142"}
Oct 13 19:58:14 crc kubenswrapper[4974]: I1013 19:58:14.335571 4974 generic.go:334] "Generic (PLEG): container finished" podID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerID="ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee" exitCode=0
Oct 13 19:58:14 crc kubenswrapper[4974]: I1013 19:58:14.335642 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vll5" event={"ID":"b1a5dacf-0328-45cb-a9e4-98698a097c89","Type":"ContainerDied","Data":"ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee"}
Oct 13 19:58:14 crc kubenswrapper[4974]: I1013 19:58:14.339032 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2ccrt" event={"ID":"377bd5f5-b4ae-43ea-b675-9c88662194b4","Type":"ContainerStarted","Data":"c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526"}
Oct 13 19:58:15 crc kubenswrapper[4974]: I1013 19:58:15.350775 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vll5" event={"ID":"b1a5dacf-0328-45cb-a9e4-98698a097c89","Type":"ContainerStarted","Data":"945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909"}
Oct 13 19:58:15 crc kubenswrapper[4974]: I1013 19:58:15.353060 4974 generic.go:334] "Generic (PLEG): container finished" podID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerID="c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526" exitCode=0
Oct 13 19:58:15 crc kubenswrapper[4974]: I1013 19:58:15.353106 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2ccrt" event={"ID":"377bd5f5-b4ae-43ea-b675-9c88662194b4","Type":"ContainerDied","Data":"c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526"}
Oct 13 19:58:15 crc kubenswrapper[4974]: I1013 19:58:15.374159 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8vll5" podStartSLOduration=2.623099079 podStartE2EDuration="6.374141184s" podCreationTimestamp="2025-10-13 19:58:09 +0000 UTC" firstStartedPulling="2025-10-13 19:58:11.300151277 +0000 UTC m=+6226.204517357" lastFinishedPulling="2025-10-13 19:58:15.051193382 +0000 UTC m=+6229.955559462" observedRunningTime="2025-10-13 19:58:15.36689594 +0000 UTC m=+6230.271262020" watchObservedRunningTime="2025-10-13 19:58:15.374141184 +0000 UTC m=+6230.278507264"
Oct 13 19:58:16 crc kubenswrapper[4974]: I1013 19:58:16.365308 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2ccrt" event={"ID":"377bd5f5-b4ae-43ea-b675-9c88662194b4","Type":"ContainerStarted","Data":"5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208"}
Oct 13 19:58:16 crc kubenswrapper[4974]: I1013 19:58:16.392772 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2ccrt" podStartSLOduration=2.954141344 podStartE2EDuration="5.392756541s" podCreationTimestamp="2025-10-13 19:58:11 +0000 UTC" firstStartedPulling="2025-10-13 19:58:13.323518991 +0000 UTC m=+6228.227885071" lastFinishedPulling="2025-10-13 19:58:15.762134198 +0000 UTC m=+6230.666500268" observedRunningTime="2025-10-13 19:58:16.388406009 +0000 UTC m=+6231.292772089" watchObservedRunningTime="2025-10-13 19:58:16.392756541 +0000 UTC m=+6231.297122621"
Oct 13 19:58:19 crc kubenswrapper[4974]: I1013 19:58:19.936840 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:19 crc kubenswrapper[4974]: I1013 19:58:19.937217 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:20 crc kubenswrapper[4974]: I1013 19:58:20.025778 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:20 crc kubenswrapper[4974]: I1013 19:58:20.511105 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:21 crc kubenswrapper[4974]: I1013 19:58:21.399235 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8vll5"]
Oct 13 19:58:21 crc kubenswrapper[4974]: I1013 19:58:21.812909 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40"
Oct 13 19:58:21 crc kubenswrapper[4974]: E1013 19:58:21.814010 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1"
Oct 13 19:58:22 crc kubenswrapper[4974]: I1013 19:58:22.338842 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:22 crc kubenswrapper[4974]: I1013 19:58:22.339803 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:22 crc kubenswrapper[4974]: I1013 19:58:22.400201 4974 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:22 crc kubenswrapper[4974]: I1013 19:58:22.448908 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8vll5" podUID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerName="registry-server" containerID="cri-o://945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909" gracePeriod=2
Oct 13 19:58:22 crc kubenswrapper[4974]: I1013 19:58:22.525245 4974 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2ccrt"
Oct 13 19:58:22 crc kubenswrapper[4974]: I1013 19:58:22.993269 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.154261 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-utilities\") pod \"b1a5dacf-0328-45cb-a9e4-98698a097c89\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") "
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.154520 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kmnz\" (UniqueName: \"kubernetes.io/projected/b1a5dacf-0328-45cb-a9e4-98698a097c89-kube-api-access-5kmnz\") pod \"b1a5dacf-0328-45cb-a9e4-98698a097c89\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") "
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.154566 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-catalog-content\") pod \"b1a5dacf-0328-45cb-a9e4-98698a097c89\" (UID: \"b1a5dacf-0328-45cb-a9e4-98698a097c89\") "
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.155356 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-utilities" (OuterVolumeSpecName: "utilities") pod "b1a5dacf-0328-45cb-a9e4-98698a097c89" (UID: "b1a5dacf-0328-45cb-a9e4-98698a097c89"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.164365 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1a5dacf-0328-45cb-a9e4-98698a097c89-kube-api-access-5kmnz" (OuterVolumeSpecName: "kube-api-access-5kmnz") pod "b1a5dacf-0328-45cb-a9e4-98698a097c89" (UID: "b1a5dacf-0328-45cb-a9e4-98698a097c89"). InnerVolumeSpecName "kube-api-access-5kmnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.215630 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1a5dacf-0328-45cb-a9e4-98698a097c89" (UID: "b1a5dacf-0328-45cb-a9e4-98698a097c89"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.256518 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-utilities\") on node \"crc\" DevicePath \"\""
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.256699 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kmnz\" (UniqueName: \"kubernetes.io/projected/b1a5dacf-0328-45cb-a9e4-98698a097c89-kube-api-access-5kmnz\") on node \"crc\" DevicePath \"\""
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.256803 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1a5dacf-0328-45cb-a9e4-98698a097c89-catalog-content\") on node \"crc\" DevicePath \"\""
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.462936 4974 generic.go:334] "Generic (PLEG): container finished" podID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerID="945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909" exitCode=0
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.463008 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vll5" event={"ID":"b1a5dacf-0328-45cb-a9e4-98698a097c89","Type":"ContainerDied","Data":"945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909"}
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.463025 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vll5"
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.464521 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vll5" event={"ID":"b1a5dacf-0328-45cb-a9e4-98698a097c89","Type":"ContainerDied","Data":"1e4b03956cdcab8a672f69293bdcfd0887d0383d45977b02975d6a0f2e859bad"}
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.464550 4974 scope.go:117] "RemoveContainer" containerID="945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909"
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.520953 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8vll5"]
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.550703 4974 scope.go:117] "RemoveContainer" containerID="ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee"
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.564869 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8vll5"]
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.578840 4974 scope.go:117] "RemoveContainer" containerID="33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86"
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.631852 4974 scope.go:117] "RemoveContainer" containerID="945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909"
Oct 13 19:58:23 crc kubenswrapper[4974]: E1013 19:58:23.632380 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909\": container with ID starting with 945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909 not found: ID does not exist" containerID="945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909"
Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.632412
4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909"} err="failed to get container status \"945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909\": rpc error: code = NotFound desc = could not find container \"945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909\": container with ID starting with 945d424e7a11c70c3cb6d4bff712a4dc5ead373c54e7c4530774870ac912e909 not found: ID does not exist" Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.632440 4974 scope.go:117] "RemoveContainer" containerID="ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee" Oct 13 19:58:23 crc kubenswrapper[4974]: E1013 19:58:23.632777 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee\": container with ID starting with ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee not found: ID does not exist" containerID="ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee" Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.632799 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee"} err="failed to get container status \"ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee\": rpc error: code = NotFound desc = could not find container \"ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee\": container with ID starting with ef339bb7a1baa5ae13212abd32360614f3eeeffd3bb6b644d71840fe8a39b4ee not found: ID does not exist" Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.632814 4974 scope.go:117] "RemoveContainer" containerID="33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86" Oct 13 19:58:23 crc kubenswrapper[4974]: E1013 
19:58:23.633075 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86\": container with ID starting with 33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86 not found: ID does not exist" containerID="33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86" Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.633097 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86"} err="failed to get container status \"33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86\": rpc error: code = NotFound desc = could not find container \"33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86\": container with ID starting with 33534e6c4eb0735505b6de2374190a733c50cf4d6fcc80f959ba2e8ad75eeb86 not found: ID does not exist" Oct 13 19:58:23 crc kubenswrapper[4974]: I1013 19:58:23.822487 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1a5dacf-0328-45cb-a9e4-98698a097c89" path="/var/lib/kubelet/pods/b1a5dacf-0328-45cb-a9e4-98698a097c89/volumes" Oct 13 19:58:24 crc kubenswrapper[4974]: I1013 19:58:24.399594 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2ccrt"] Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 19:58:25.178487 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/util/0.log" Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 19:58:25.390827 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/pull/0.log" Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 
19:58:25.412348 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/pull/0.log" Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 19:58:25.421320 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/util/0.log" Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 19:58:25.482961 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2ccrt" podUID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerName="registry-server" containerID="cri-o://5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208" gracePeriod=2 Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 19:58:25.565450 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/pull/0.log" Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 19:58:25.602975 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/util/0.log" Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 19:58:25.630929 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7b28b73898c2de401ad8347f73e60cfa278cffeb7e6a7707f86b1db2617rkv8_fb93a5ae-368f-4fce-b522-b318fa519ade/extract/0.log" Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 19:58:25.807569 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-hvqbv_ef8af802-f6f6-4018-9bfd-f8aee92ff838/kube-rbac-proxy/0.log" Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 19:58:25.871715 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-64f84fcdbb-hvqbv_ef8af802-f6f6-4018-9bfd-f8aee92ff838/manager/0.log" Oct 13 19:58:25 crc kubenswrapper[4974]: I1013 19:58:25.906994 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-t2hfb_b44da60c-a4d1-406d-abb8-db29314b9e50/kube-rbac-proxy/0.log" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.002321 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2ccrt" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.138460 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-catalog-content\") pod \"377bd5f5-b4ae-43ea-b675-9c88662194b4\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.138583 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-utilities\") pod \"377bd5f5-b4ae-43ea-b675-9c88662194b4\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.138719 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s87m2\" (UniqueName: \"kubernetes.io/projected/377bd5f5-b4ae-43ea-b675-9c88662194b4-kube-api-access-s87m2\") pod \"377bd5f5-b4ae-43ea-b675-9c88662194b4\" (UID: \"377bd5f5-b4ae-43ea-b675-9c88662194b4\") " Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.141776 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-utilities" (OuterVolumeSpecName: "utilities") pod "377bd5f5-b4ae-43ea-b675-9c88662194b4" (UID: 
"377bd5f5-b4ae-43ea-b675-9c88662194b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.146921 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/377bd5f5-b4ae-43ea-b675-9c88662194b4-kube-api-access-s87m2" (OuterVolumeSpecName: "kube-api-access-s87m2") pod "377bd5f5-b4ae-43ea-b675-9c88662194b4" (UID: "377bd5f5-b4ae-43ea-b675-9c88662194b4"). InnerVolumeSpecName "kube-api-access-s87m2". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.159611 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "377bd5f5-b4ae-43ea-b675-9c88662194b4" (UID: "377bd5f5-b4ae-43ea-b675-9c88662194b4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.240789 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s87m2\" (UniqueName: \"kubernetes.io/projected/377bd5f5-b4ae-43ea-b675-9c88662194b4-kube-api-access-s87m2\") on node \"crc\" DevicePath \"\"" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.240824 4974 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-catalog-content\") on node \"crc\" DevicePath \"\"" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.240837 4974 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/377bd5f5-b4ae-43ea-b675-9c88662194b4-utilities\") on node \"crc\" DevicePath \"\"" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.272321 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-59cdc64769-t2hfb_b44da60c-a4d1-406d-abb8-db29314b9e50/manager/0.log" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.329427 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-h2cmd_758864e5-2a90-496e-b006-dcfaf42c20bb/kube-rbac-proxy/0.log" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.357590 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-687df44cdb-h2cmd_758864e5-2a90-496e-b006-dcfaf42c20bb/manager/0.log" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.495760 4974 generic.go:334] "Generic (PLEG): container finished" podID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerID="5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208" exitCode=0 Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.495803 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2ccrt" event={"ID":"377bd5f5-b4ae-43ea-b675-9c88662194b4","Type":"ContainerDied","Data":"5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208"} Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.495832 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2ccrt" event={"ID":"377bd5f5-b4ae-43ea-b675-9c88662194b4","Type":"ContainerDied","Data":"29947529519b6486f122bb46b86f3014df2ee02cbaef7d98c3123dae99e2e142"} Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.495855 4974 scope.go:117] "RemoveContainer" containerID="5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.495974 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2ccrt" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.536642 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-wzvwc_78805c21-d9b5-4f77-a318-fa1dfa26ebc3/kube-rbac-proxy/0.log" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.543750 4974 scope.go:117] "RemoveContainer" containerID="c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.583377 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2ccrt"] Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.619006 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2ccrt"] Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.625486 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-7bb46cd7d-wzvwc_78805c21-d9b5-4f77-a318-fa1dfa26ebc3/manager/0.log" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.626080 4974 scope.go:117] "RemoveContainer" containerID="732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.671865 4974 scope.go:117] "RemoveContainer" containerID="5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208" Oct 13 19:58:26 crc kubenswrapper[4974]: E1013 19:58:26.675756 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208\": container with ID starting with 5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208 not found: ID does not exist" containerID="5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.675794 4974 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208"} err="failed to get container status \"5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208\": rpc error: code = NotFound desc = could not find container \"5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208\": container with ID starting with 5688341f406358415a2731e4a9d4d68a2f08c40d44beba1811f9f5d5f9f63208 not found: ID does not exist" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.675816 4974 scope.go:117] "RemoveContainer" containerID="c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526" Oct 13 19:58:26 crc kubenswrapper[4974]: E1013 19:58:26.677046 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526\": container with ID starting with c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526 not found: ID does not exist" containerID="c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.677186 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526"} err="failed to get container status \"c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526\": rpc error: code = NotFound desc = could not find container \"c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526\": container with ID starting with c67f59501a41881f779011104fed2fc4ab786355ae8c7052f0fda9d23bdb0526 not found: ID does not exist" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.677290 4974 scope.go:117] "RemoveContainer" containerID="732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632" Oct 13 19:58:26 crc kubenswrapper[4974]: E1013 
19:58:26.678516 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632\": container with ID starting with 732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632 not found: ID does not exist" containerID="732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.678598 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632"} err="failed to get container status \"732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632\": rpc error: code = NotFound desc = could not find container \"732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632\": container with ID starting with 732a05608c63d0d4941a7d220e56cc1aa65f7607c45a26d11220a2996d6a5632 not found: ID does not exist" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.715438 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-n4n2k_50ce5538-ff95-4983-8ff7-3a406b974617/manager/0.log" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.729014 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-6d9967f8dd-n4n2k_50ce5538-ff95-4983-8ff7-3a406b974617/kube-rbac-proxy/0.log" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.832002 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-cp96l_f332d432-86f0-4c0b-80d6-dba6e2920a81/kube-rbac-proxy/0.log" Oct 13 19:58:26 crc kubenswrapper[4974]: I1013 19:58:26.910860 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d74794d9b-cp96l_f332d432-86f0-4c0b-80d6-dba6e2920a81/manager/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.040976 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-rzm52_e152664c-85e7-4854-8960-ee413a7eb3a3/kube-rbac-proxy/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.173090 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-w4lrj_e5d3e6f8-15bf-4544-b701-da591158af75/kube-rbac-proxy/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.273663 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-585fc5b659-rzm52_e152664c-85e7-4854-8960-ee413a7eb3a3/manager/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.299616 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-74cb5cbc49-w4lrj_e5d3e6f8-15bf-4544-b701-da591158af75/manager/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.435229 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hvwzb_2b43f3c2-b280-40e9-9467-181a372011e1/kube-rbac-proxy/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.531299 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-ddb98f99b-hvwzb_2b43f3c2-b280-40e9-9467-181a372011e1/manager/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.564497 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-zkfrg_bcad591b-b126-4da8-a21c-636d710329b8/kube-rbac-proxy/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.724312 
4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-59578bc799-zkfrg_bcad591b-b126-4da8-a21c-636d710329b8/manager/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.775950 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-gq4sm_c10ae245-c899-4ea9-9edb-d62b176d19cc/kube-rbac-proxy/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.822830 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="377bd5f5-b4ae-43ea-b675-9c88662194b4" path="/var/lib/kubelet/pods/377bd5f5-b4ae-43ea-b675-9c88662194b4/volumes" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.836446 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5777b4f897-gq4sm_c10ae245-c899-4ea9-9edb-d62b176d19cc/manager/0.log" Oct 13 19:58:27 crc kubenswrapper[4974]: I1013 19:58:27.955435 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-7p29r_86f89f48-3e17-4ed9-9cbb-6458223a1864/kube-rbac-proxy/0.log" Oct 13 19:58:28 crc kubenswrapper[4974]: I1013 19:58:28.007592 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-797d478b46-7p29r_86f89f48-3e17-4ed9-9cbb-6458223a1864/manager/0.log" Oct 13 19:58:28 crc kubenswrapper[4974]: I1013 19:58:28.144247 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-2xgsp_197d51a8-e30e-485c-8e76-bd4ee120da7b/kube-rbac-proxy/0.log" Oct 13 19:58:28 crc kubenswrapper[4974]: I1013 19:58:28.282871 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-57bb74c7bf-2xgsp_197d51a8-e30e-485c-8e76-bd4ee120da7b/manager/0.log" Oct 13 19:58:28 crc 
kubenswrapper[4974]: I1013 19:58:28.295907 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-zx7qh_923ead90-d60a-431b-9630-693bdc007237/kube-rbac-proxy/0.log" Oct 13 19:58:28 crc kubenswrapper[4974]: I1013 19:58:28.384191 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-6d7c7ddf95-zx7qh_923ead90-d60a-431b-9630-693bdc007237/manager/0.log" Oct 13 19:58:28 crc kubenswrapper[4974]: I1013 19:58:28.463550 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w_d95330d6-c9ed-4fe6-8daa-6ef9495e72ae/kube-rbac-proxy/0.log" Oct 13 19:58:28 crc kubenswrapper[4974]: I1013 19:58:28.510244 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6cc7fb757d2qz9w_d95330d6-c9ed-4fe6-8daa-6ef9495e72ae/manager/0.log" Oct 13 19:58:28 crc kubenswrapper[4974]: I1013 19:58:28.742872 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7995b9c57f-x4jst_92f85149-41d6-471d-8d77-25fdafb20ca2/kube-rbac-proxy/0.log" Oct 13 19:58:28 crc kubenswrapper[4974]: I1013 19:58:28.929160 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8d8df4487-k7bh6_72ade784-9a52-4442-b3e6-044297f70cb7/kube-rbac-proxy/0.log" Oct 13 19:58:29 crc kubenswrapper[4974]: I1013 19:58:29.198013 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-8d8df4487-k7bh6_72ade784-9a52-4442-b3e6-044297f70cb7/operator/0.log" Oct 13 19:58:29 crc kubenswrapper[4974]: I1013 19:58:29.256306 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-zdv77_208151b9-4d45-4a71-9417-5082f935fd8b/registry-server/0.log" Oct 13 19:58:29 crc kubenswrapper[4974]: I1013 19:58:29.464806 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-8ftd2_f9ed9202-2a09-42d2-b140-8300e108e36a/kube-rbac-proxy/0.log" Oct 13 19:58:29 crc kubenswrapper[4974]: I1013 19:58:29.591779 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-869cc7797f-8ftd2_f9ed9202-2a09-42d2-b140-8300e108e36a/manager/0.log" Oct 13 19:58:29 crc kubenswrapper[4974]: I1013 19:58:29.704249 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-rs4rf_4cbd873e-490d-4f1c-91cc-4ca45f109d7f/kube-rbac-proxy/0.log" Oct 13 19:58:29 crc kubenswrapper[4974]: I1013 19:58:29.755392 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-664664cb68-rs4rf_4cbd873e-490d-4f1c-91cc-4ca45f109d7f/manager/0.log" Oct 13 19:58:29 crc kubenswrapper[4974]: I1013 19:58:29.805308 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-7995b9c57f-x4jst_92f85149-41d6-471d-8d77-25fdafb20ca2/manager/0.log" Oct 13 19:58:29 crc kubenswrapper[4974]: I1013 19:58:29.933048 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-5f97d8c699-scqj9_9a259044-9901-4a97-89f7-965118976af7/operator/0.log" Oct 13 19:58:30 crc kubenswrapper[4974]: I1013 19:58:30.007293 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-mj7kp_45bbc336-9feb-40e0-b7a9-92fad85e7396/manager/0.log" Oct 13 19:58:30 crc kubenswrapper[4974]: I1013 19:58:30.022760 4974 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f4d5dfdc6-mj7kp_45bbc336-9feb-40e0-b7a9-92fad85e7396/kube-rbac-proxy/0.log" Oct 13 19:58:30 crc kubenswrapper[4974]: I1013 19:58:30.126105 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-2q9c8_7269886e-6ad1-43fe-a8f2-c535dffe836c/kube-rbac-proxy/0.log" Oct 13 19:58:30 crc kubenswrapper[4974]: I1013 19:58:30.194898 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-lb8ln_e6e02a94-3239-4e8b-8d87-4adb4ebcc98b/kube-rbac-proxy/0.log" Oct 13 19:58:30 crc kubenswrapper[4974]: I1013 19:58:30.273557 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-ffcdd6c94-lb8ln_e6e02a94-3239-4e8b-8d87-4adb4ebcc98b/manager/0.log" Oct 13 19:58:30 crc kubenswrapper[4974]: I1013 19:58:30.375197 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6f64d8b78-d6wjd_7bb571d8-3894-46f5-a627-932b5dfdc2fd/kube-rbac-proxy/0.log" Oct 13 19:58:30 crc kubenswrapper[4974]: I1013 19:58:30.449062 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-578874c84d-2q9c8_7269886e-6ad1-43fe-a8f2-c535dffe836c/manager/0.log" Oct 13 19:58:30 crc kubenswrapper[4974]: I1013 19:58:30.550752 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6f64d8b78-d6wjd_7bb571d8-3894-46f5-a627-932b5dfdc2fd/manager/0.log" Oct 13 19:58:35 crc kubenswrapper[4974]: I1013 19:58:35.822245 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:58:35 crc kubenswrapper[4974]: E1013 19:58:35.823232 4974 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:58:47 crc kubenswrapper[4974]: I1013 19:58:47.888145 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-h6bwk_78a93fc9-5305-44ea-a573-3e54bd52f22d/control-plane-machine-set-operator/0.log" Oct 13 19:58:48 crc kubenswrapper[4974]: I1013 19:58:48.072686 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-knlnk_bc5f4140-1f56-472e-95ed-cf3d4fb85f45/kube-rbac-proxy/0.log" Oct 13 19:58:48 crc kubenswrapper[4974]: I1013 19:58:48.110588 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-knlnk_bc5f4140-1f56-472e-95ed-cf3d4fb85f45/machine-api-operator/0.log" Oct 13 19:58:50 crc kubenswrapper[4974]: I1013 19:58:50.811587 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:58:50 crc kubenswrapper[4974]: E1013 19:58:50.812329 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:59:00 crc kubenswrapper[4974]: I1013 19:59:00.988514 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-79kfk_37764aab-fdaf-4d54-8afc-f2788411ff07/cert-manager-controller/0.log" Oct 13 19:59:01 crc kubenswrapper[4974]: I1013 19:59:01.166817 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-pbvz7_70f7d9f3-cbb4-4009-9feb-89a4eb2bbf95/cert-manager-cainjector/0.log" Oct 13 19:59:01 crc kubenswrapper[4974]: I1013 19:59:01.208042 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-lhmgd_88a02d7c-89e6-464c-b519-aeb3fe4dfda3/cert-manager-webhook/0.log" Oct 13 19:59:01 crc kubenswrapper[4974]: I1013 19:59:01.811950 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:59:01 crc kubenswrapper[4974]: E1013 19:59:01.812206 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:59:13 crc kubenswrapper[4974]: I1013 19:59:13.771332 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-6b874cbd85-rx8hz_4d9c8026-13ca-4df7-8bfc-d36594573e26/nmstate-console-plugin/0.log" Oct 13 19:59:13 crc kubenswrapper[4974]: I1013 19:59:13.812102 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:59:13 crc kubenswrapper[4974]: E1013 19:59:13.812364 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:59:13 crc kubenswrapper[4974]: I1013 19:59:13.963351 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-zg86t_16f35fcb-c349-4306-a4ee-306dfff9a8f1/kube-rbac-proxy/0.log" Oct 13 19:59:13 crc kubenswrapper[4974]: I1013 19:59:13.967272 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-skb5t_51e7a340-7c32-4fae-b22f-2dd321f0afc1/nmstate-handler/0.log" Oct 13 19:59:13 crc kubenswrapper[4974]: I1013 19:59:13.997075 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-fdff9cb8d-zg86t_16f35fcb-c349-4306-a4ee-306dfff9a8f1/nmstate-metrics/0.log" Oct 13 19:59:14 crc kubenswrapper[4974]: I1013 19:59:14.169674 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-858ddd8f98-5cgww_b8f48940-751e-4e3c-98a3-d29c1b73e776/nmstate-operator/0.log" Oct 13 19:59:14 crc kubenswrapper[4974]: I1013 19:59:14.183837 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6cdbc54649-rkbf2_ded07897-9b9c-4548-a909-02c623167912/nmstate-webhook/0.log" Oct 13 19:59:24 crc kubenswrapper[4974]: I1013 19:59:24.811562 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:59:24 crc kubenswrapper[4974]: E1013 19:59:24.812596 4974 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-xpb6b_openshift-machine-config-operator(013d968f-6cef-476b-a6fc-88d396bd5cd1)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.101642 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-2t9km_77a8d6d5-aa09-4168-8c4f-228849d999e2/kube-rbac-proxy/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.222932 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-frr-files/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.254080 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-68d546b9d8-2t9km_77a8d6d5-aa09-4168-8c4f-228849d999e2/controller/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.399790 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-frr-files/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.426347 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-metrics/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.457487 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-reloader/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.499770 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-reloader/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.617930 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-frr-files/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.645601 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-metrics/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.646640 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-reloader/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.664603 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-metrics/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.864796 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-metrics/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.865132 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-frr-files/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.869591 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/controller/0.log" Oct 13 19:59:28 crc kubenswrapper[4974]: I1013 19:59:28.889763 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/cp-reloader/0.log" Oct 13 19:59:29 crc kubenswrapper[4974]: I1013 19:59:29.057124 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/frr-metrics/0.log" Oct 13 19:59:29 crc kubenswrapper[4974]: I1013 19:59:29.070699 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/kube-rbac-proxy/0.log" Oct 13 19:59:29 crc kubenswrapper[4974]: I1013 19:59:29.113311 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/kube-rbac-proxy-frr/0.log" Oct 13 19:59:29 crc kubenswrapper[4974]: I1013 19:59:29.304829 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/reloader/0.log" Oct 13 19:59:29 crc kubenswrapper[4974]: I1013 19:59:29.322009 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-64bf5d555-t5brv_b014985a-51e5-494a-a16b-c126e6fce6b3/frr-k8s-webhook-server/0.log" Oct 13 19:59:29 crc kubenswrapper[4974]: I1013 19:59:29.524902 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-fd8c579fc-kgnkv_b51360f9-c2df-4940-8a9f-91bd9287605c/manager/0.log" Oct 13 19:59:29 crc kubenswrapper[4974]: I1013 19:59:29.727399 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-9c5d5965c-l9f5q_ca62cae8-3dc3-492d-aa06-59d085da2253/webhook-server/0.log" Oct 13 19:59:29 crc kubenswrapper[4974]: I1013 19:59:29.766313 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lc487_6d0e7abb-aa57-48af-9a9a-d3c626b9131a/kube-rbac-proxy/0.log" Oct 13 19:59:30 crc kubenswrapper[4974]: I1013 19:59:30.387292 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lc487_6d0e7abb-aa57-48af-9a9a-d3c626b9131a/speaker/0.log" Oct 13 19:59:30 crc kubenswrapper[4974]: I1013 19:59:30.837719 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w9rrg_0842201f-d8df-4376-b130-4bc0c560dc37/frr/0.log" Oct 13 19:59:39 crc kubenswrapper[4974]: I1013 19:59:39.811630 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 19:59:40 crc kubenswrapper[4974]: I1013 19:59:40.322484 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"fc2b28bf2a1af49a262232e7c9a21172aeb3f7c941e6129ac71d99b29609819a"} Oct 13 19:59:42 crc kubenswrapper[4974]: I1013 19:59:42.913924 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/util/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.051228 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/pull/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.063273 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/util/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.121904 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/pull/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.280206 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/extract/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.288165 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/util/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.291022 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8f2f4ee801e5826a37d84a7b1fc4ccbf6b79de668302737d0f1152d8d2lrwpk_43f2dc64-d1da-4f5e-b7e6-f343259f5784/pull/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.419087 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/util/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.611395 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/pull/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.628589 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/pull/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.630055 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/util/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.806625 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/pull/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.829434 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/extract/0.log" Oct 13 19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.845448 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkbdcs_9250202b-1093-4e24-a2c7-0c907f458986/util/0.log" Oct 13 
19:59:43 crc kubenswrapper[4974]: I1013 19:59:43.972664 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-utilities/0.log" Oct 13 19:59:44 crc kubenswrapper[4974]: I1013 19:59:44.144020 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-utilities/0.log" Oct 13 19:59:44 crc kubenswrapper[4974]: I1013 19:59:44.159257 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-content/0.log" Oct 13 19:59:44 crc kubenswrapper[4974]: I1013 19:59:44.185862 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-content/0.log" Oct 13 19:59:44 crc kubenswrapper[4974]: I1013 19:59:44.344960 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-utilities/0.log" Oct 13 19:59:44 crc kubenswrapper[4974]: I1013 19:59:44.358391 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/extract-content/0.log" Oct 13 19:59:44 crc kubenswrapper[4974]: I1013 19:59:44.586267 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-utilities/0.log" Oct 13 19:59:44 crc kubenswrapper[4974]: I1013 19:59:44.690668 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r4mpc_b0812172-6c40-43af-af1b-1e5a90ae8fe8/registry-server/0.log" Oct 13 19:59:44 crc kubenswrapper[4974]: I1013 19:59:44.801197 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-content/0.log" Oct 13 19:59:44 crc kubenswrapper[4974]: I1013 19:59:44.804261 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-utilities/0.log" Oct 13 19:59:44 crc kubenswrapper[4974]: I1013 19:59:44.833592 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-content/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.049232 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-utilities/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.094890 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/extract-content/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.296889 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/util/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.475992 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/pull/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.489959 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/util/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.536347 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/pull/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.784309 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/pull/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.788769 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/extract/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.822415 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_fa9831ede5d93c33d525b70ce6ddf94e500d80992af75a3305fe98835c89gwt_40c8dc6f-22e8-44bb-818f-1861ac1566cf/util/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.977848 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-llxfj_b21f08a9-cf8a-4598-8b0a-f43015102fc6/registry-server/0.log" Oct 13 19:59:45 crc kubenswrapper[4974]: I1013 19:59:45.983693 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-jvrtp_09e0a416-6821-4853-8c22-d5e55e540657/marketplace-operator/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.176883 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-utilities/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.327788 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-content/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.375761 4974 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-utilities/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.389087 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-content/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.567985 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-content/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.587867 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/extract-utilities/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.633639 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-utilities/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.708257 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6jsrw_b9b28cad-3016-495e-b2cb-33b07dcc4d2d/registry-server/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.788486 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-utilities/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.815112 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-content/0.log" Oct 13 19:59:46 crc kubenswrapper[4974]: I1013 19:59:46.860064 4974 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-content/0.log" Oct 13 19:59:47 crc kubenswrapper[4974]: I1013 19:59:47.025738 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-utilities/0.log" Oct 13 19:59:47 crc kubenswrapper[4974]: I1013 19:59:47.025779 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/extract-content/0.log" Oct 13 19:59:47 crc kubenswrapper[4974]: I1013 19:59:47.796421 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-8dgp2_2d8eecf7-a8ba-4155-bad2-1931a61fd0e8/registry-server/0.log" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.102176 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-qc6nj_bf1f0f19-a1c6-4d16-8876-a70c018e0452/prometheus-operator/0.log" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.138828 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh"] Oct 13 20:00:00 crc kubenswrapper[4974]: E1013 20:00:00.139253 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerName="registry-server" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.139277 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerName="registry-server" Oct 13 20:00:00 crc kubenswrapper[4974]: E1013 20:00:00.139288 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerName="registry-server" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.139295 4974 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerName="registry-server" Oct 13 20:00:00 crc kubenswrapper[4974]: E1013 20:00:00.139317 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerName="extract-content" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.139325 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerName="extract-content" Oct 13 20:00:00 crc kubenswrapper[4974]: E1013 20:00:00.139359 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerName="extract-utilities" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.139367 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerName="extract-utilities" Oct 13 20:00:00 crc kubenswrapper[4974]: E1013 20:00:00.139384 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerName="extract-content" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.139392 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerName="extract-content" Oct 13 20:00:00 crc kubenswrapper[4974]: E1013 20:00:00.139404 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerName="extract-utilities" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.139412 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerName="extract-utilities" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.139664 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="377bd5f5-b4ae-43ea-b675-9c88662194b4" containerName="registry-server" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.139698 4974 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="b1a5dacf-0328-45cb-a9e4-98698a097c89" containerName="registry-server" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.140549 4974 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.142328 4974 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.142709 4974 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.217280 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh"] Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.318413 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67dae90e-0fc4-4bf6-a161-3189d29d9865-config-volume\") pod \"collect-profiles-29339760-s5snh\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.318705 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67dae90e-0fc4-4bf6-a161-3189d29d9865-secret-volume\") pod \"collect-profiles-29339760-s5snh\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.318765 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5mc9\" (UniqueName: 
\"kubernetes.io/projected/67dae90e-0fc4-4bf6-a161-3189d29d9865-kube-api-access-m5mc9\") pod \"collect-profiles-29339760-s5snh\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.348897 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67d85487b6-5hftg_33d95548-42f2-4bde-88eb-23cfd6a5c5c0/prometheus-operator-admission-webhook/0.log" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.418781 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67d85487b6-qpjpt_cffd65cb-eb29-48d8-b634-4e535b39ce51/prometheus-operator-admission-webhook/0.log" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.420080 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67dae90e-0fc4-4bf6-a161-3189d29d9865-config-volume\") pod \"collect-profiles-29339760-s5snh\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.420141 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67dae90e-0fc4-4bf6-a161-3189d29d9865-secret-volume\") pod \"collect-profiles-29339760-s5snh\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.420195 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5mc9\" (UniqueName: \"kubernetes.io/projected/67dae90e-0fc4-4bf6-a161-3189d29d9865-kube-api-access-m5mc9\") pod \"collect-profiles-29339760-s5snh\" (UID: 
\"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.421288 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67dae90e-0fc4-4bf6-a161-3189d29d9865-config-volume\") pod \"collect-profiles-29339760-s5snh\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.432278 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67dae90e-0fc4-4bf6-a161-3189d29d9865-secret-volume\") pod \"collect-profiles-29339760-s5snh\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.437088 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5mc9\" (UniqueName: \"kubernetes.io/projected/67dae90e-0fc4-4bf6-a161-3189d29d9865-kube-api-access-m5mc9\") pod \"collect-profiles-29339760-s5snh\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.543074 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.566179 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-zqdvp_adeddb78-abbb-494b-b723-d3ed7a66503f/operator/0.log" Oct 13 20:00:00 crc kubenswrapper[4974]: I1013 20:00:00.638641 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-8mc8w_bcccc6ab-47aa-4c17-88e0-fb0ed3ea5471/perses-operator/0.log" Oct 13 20:00:01 crc kubenswrapper[4974]: I1013 20:00:01.023493 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh"] Oct 13 20:00:01 crc kubenswrapper[4974]: I1013 20:00:01.558446 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" event={"ID":"67dae90e-0fc4-4bf6-a161-3189d29d9865","Type":"ContainerStarted","Data":"a789accf97cfcb59d93021faf2c62e3c20d42d8e4b5aca388725c3f971818c17"} Oct 13 20:00:01 crc kubenswrapper[4974]: I1013 20:00:01.558487 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" event={"ID":"67dae90e-0fc4-4bf6-a161-3189d29d9865","Type":"ContainerStarted","Data":"5fa85c55d6a116984ded5856d5812905ad86560068b086db495946fa133a2ee7"} Oct 13 20:00:01 crc kubenswrapper[4974]: I1013 20:00:01.576007 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" podStartSLOduration=1.5759906369999999 podStartE2EDuration="1.575990637s" podCreationTimestamp="2025-10-13 20:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 20:00:01.570949945 +0000 UTC m=+6336.475316025" 
watchObservedRunningTime="2025-10-13 20:00:01.575990637 +0000 UTC m=+6336.480356707" Oct 13 20:00:02 crc kubenswrapper[4974]: I1013 20:00:02.571738 4974 generic.go:334] "Generic (PLEG): container finished" podID="67dae90e-0fc4-4bf6-a161-3189d29d9865" containerID="a789accf97cfcb59d93021faf2c62e3c20d42d8e4b5aca388725c3f971818c17" exitCode=0 Oct 13 20:00:02 crc kubenswrapper[4974]: I1013 20:00:02.571822 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" event={"ID":"67dae90e-0fc4-4bf6-a161-3189d29d9865","Type":"ContainerDied","Data":"a789accf97cfcb59d93021faf2c62e3c20d42d8e4b5aca388725c3f971818c17"} Oct 13 20:00:03 crc kubenswrapper[4974]: I1013 20:00:03.962195 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.104246 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67dae90e-0fc4-4bf6-a161-3189d29d9865-secret-volume\") pod \"67dae90e-0fc4-4bf6-a161-3189d29d9865\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.104303 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5mc9\" (UniqueName: \"kubernetes.io/projected/67dae90e-0fc4-4bf6-a161-3189d29d9865-kube-api-access-m5mc9\") pod \"67dae90e-0fc4-4bf6-a161-3189d29d9865\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.104594 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67dae90e-0fc4-4bf6-a161-3189d29d9865-config-volume\") pod \"67dae90e-0fc4-4bf6-a161-3189d29d9865\" (UID: \"67dae90e-0fc4-4bf6-a161-3189d29d9865\") " Oct 13 20:00:04 crc 
kubenswrapper[4974]: I1013 20:00:04.105458 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67dae90e-0fc4-4bf6-a161-3189d29d9865-config-volume" (OuterVolumeSpecName: "config-volume") pod "67dae90e-0fc4-4bf6-a161-3189d29d9865" (UID: "67dae90e-0fc4-4bf6-a161-3189d29d9865"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.111366 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67dae90e-0fc4-4bf6-a161-3189d29d9865-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "67dae90e-0fc4-4bf6-a161-3189d29d9865" (UID: "67dae90e-0fc4-4bf6-a161-3189d29d9865"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.111856 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67dae90e-0fc4-4bf6-a161-3189d29d9865-kube-api-access-m5mc9" (OuterVolumeSpecName: "kube-api-access-m5mc9") pod "67dae90e-0fc4-4bf6-a161-3189d29d9865" (UID: "67dae90e-0fc4-4bf6-a161-3189d29d9865"). InnerVolumeSpecName "kube-api-access-m5mc9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.207400 4974 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67dae90e-0fc4-4bf6-a161-3189d29d9865-config-volume\") on node \"crc\" DevicePath \"\"" Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.207428 4974 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/67dae90e-0fc4-4bf6-a161-3189d29d9865-secret-volume\") on node \"crc\" DevicePath \"\"" Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.207438 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5mc9\" (UniqueName: \"kubernetes.io/projected/67dae90e-0fc4-4bf6-a161-3189d29d9865-kube-api-access-m5mc9\") on node \"crc\" DevicePath \"\"" Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.594196 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" event={"ID":"67dae90e-0fc4-4bf6-a161-3189d29d9865","Type":"ContainerDied","Data":"5fa85c55d6a116984ded5856d5812905ad86560068b086db495946fa133a2ee7"} Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.594427 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5fa85c55d6a116984ded5856d5812905ad86560068b086db495946fa133a2ee7" Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.594444 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29339760-s5snh" Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.671240 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc"] Oct 13 20:00:04 crc kubenswrapper[4974]: I1013 20:00:04.679099 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29339715-zvdxc"] Oct 13 20:00:05 crc kubenswrapper[4974]: I1013 20:00:05.837621 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7733a86d-f657-47de-a8a6-50328f3a9392" path="/var/lib/kubelet/pods/7733a86d-f657-47de-a8a6-50328f3a9392/volumes" Oct 13 20:00:52 crc kubenswrapper[4974]: I1013 20:00:52.304753 4974 scope.go:117] "RemoveContainer" containerID="eab290ffa0f05f430f550444a0ffac86a4fe7888eb35c4c80f1ced0471ecfbc8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.179063 4974 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29339761-l2gb8"] Oct 13 20:01:00 crc kubenswrapper[4974]: E1013 20:01:00.180592 4974 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dae90e-0fc4-4bf6-a161-3189d29d9865" containerName="collect-profiles" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.180619 4974 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dae90e-0fc4-4bf6-a161-3189d29d9865" containerName="collect-profiles" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.181179 4974 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dae90e-0fc4-4bf6-a161-3189d29d9865" containerName="collect-profiles" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.182495 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.208119 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29339761-l2gb8"] Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.313362 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7nls\" (UniqueName: \"kubernetes.io/projected/82a91310-26bd-4860-8186-7dcec58199ee-kube-api-access-j7nls\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.313422 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-combined-ca-bundle\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.313877 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-fernet-keys\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.314214 4974 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-config-data\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.415627 4974 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-config-data\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.415738 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7nls\" (UniqueName: \"kubernetes.io/projected/82a91310-26bd-4860-8186-7dcec58199ee-kube-api-access-j7nls\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.415780 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-combined-ca-bundle\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.415955 4974 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-fernet-keys\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.422409 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-config-data\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.432277 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-combined-ca-bundle\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.434642 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-fernet-keys\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.454501 4974 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7nls\" (UniqueName: \"kubernetes.io/projected/82a91310-26bd-4860-8186-7dcec58199ee-kube-api-access-j7nls\") pod \"keystone-cron-29339761-l2gb8\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:00 crc kubenswrapper[4974]: I1013 20:01:00.506282 4974 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:01 crc kubenswrapper[4974]: I1013 20:01:01.152448 4974 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29339761-l2gb8"] Oct 13 20:01:01 crc kubenswrapper[4974]: I1013 20:01:01.295029 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339761-l2gb8" event={"ID":"82a91310-26bd-4860-8186-7dcec58199ee","Type":"ContainerStarted","Data":"65348956321c649b40052bf6950119e21b9e359bfc806e8792f85def65ffc206"} Oct 13 20:01:02 crc kubenswrapper[4974]: I1013 20:01:02.306733 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339761-l2gb8" event={"ID":"82a91310-26bd-4860-8186-7dcec58199ee","Type":"ContainerStarted","Data":"e04190916ef106970766c6a1b852ecc3398370f59f31cea3157a7886a0d84b3c"} Oct 13 20:01:02 crc kubenswrapper[4974]: I1013 20:01:02.341103 4974 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29339761-l2gb8" podStartSLOduration=2.341079456 podStartE2EDuration="2.341079456s" podCreationTimestamp="2025-10-13 20:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-10-13 20:01:02.331939859 +0000 UTC m=+6397.236305939" watchObservedRunningTime="2025-10-13 20:01:02.341079456 +0000 UTC m=+6397.245445546" Oct 13 20:01:05 crc kubenswrapper[4974]: I1013 20:01:05.353202 4974 generic.go:334] "Generic (PLEG): container finished" podID="82a91310-26bd-4860-8186-7dcec58199ee" containerID="e04190916ef106970766c6a1b852ecc3398370f59f31cea3157a7886a0d84b3c" exitCode=0 Oct 13 20:01:05 crc kubenswrapper[4974]: I1013 20:01:05.353299 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339761-l2gb8" 
event={"ID":"82a91310-26bd-4860-8186-7dcec58199ee","Type":"ContainerDied","Data":"e04190916ef106970766c6a1b852ecc3398370f59f31cea3157a7886a0d84b3c"} Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.775136 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.868006 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7nls\" (UniqueName: \"kubernetes.io/projected/82a91310-26bd-4860-8186-7dcec58199ee-kube-api-access-j7nls\") pod \"82a91310-26bd-4860-8186-7dcec58199ee\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.868299 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-config-data\") pod \"82a91310-26bd-4860-8186-7dcec58199ee\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.868570 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-combined-ca-bundle\") pod \"82a91310-26bd-4860-8186-7dcec58199ee\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.868879 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-fernet-keys\") pod \"82a91310-26bd-4860-8186-7dcec58199ee\" (UID: \"82a91310-26bd-4860-8186-7dcec58199ee\") " Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.890030 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82a91310-26bd-4860-8186-7dcec58199ee-kube-api-access-j7nls" 
(OuterVolumeSpecName: "kube-api-access-j7nls") pod "82a91310-26bd-4860-8186-7dcec58199ee" (UID: "82a91310-26bd-4860-8186-7dcec58199ee"). InnerVolumeSpecName "kube-api-access-j7nls". PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.894493 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "82a91310-26bd-4860-8186-7dcec58199ee" (UID: "82a91310-26bd-4860-8186-7dcec58199ee"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.941398 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "82a91310-26bd-4860-8186-7dcec58199ee" (UID: "82a91310-26bd-4860-8186-7dcec58199ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.961349 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-config-data" (OuterVolumeSpecName: "config-data") pod "82a91310-26bd-4860-8186-7dcec58199ee" (UID: "82a91310-26bd-4860-8186-7dcec58199ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.971545 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7nls\" (UniqueName: \"kubernetes.io/projected/82a91310-26bd-4860-8186-7dcec58199ee-kube-api-access-j7nls\") on node \"crc\" DevicePath \"\"" Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.971578 4974 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-config-data\") on node \"crc\" DevicePath \"\"" Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.971592 4974 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Oct 13 20:01:06 crc kubenswrapper[4974]: I1013 20:01:06.971603 4974 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/82a91310-26bd-4860-8186-7dcec58199ee-fernet-keys\") on node \"crc\" DevicePath \"\"" Oct 13 20:01:07 crc kubenswrapper[4974]: I1013 20:01:07.385200 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29339761-l2gb8" event={"ID":"82a91310-26bd-4860-8186-7dcec58199ee","Type":"ContainerDied","Data":"65348956321c649b40052bf6950119e21b9e359bfc806e8792f85def65ffc206"} Oct 13 20:01:07 crc kubenswrapper[4974]: I1013 20:01:07.385265 4974 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65348956321c649b40052bf6950119e21b9e359bfc806e8792f85def65ffc206" Oct 13 20:01:07 crc kubenswrapper[4974]: I1013 20:01:07.385353 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29339761-l2gb8" Oct 13 20:01:53 crc kubenswrapper[4974]: I1013 20:01:53.039968 4974 generic.go:334] "Generic (PLEG): container finished" podID="4c30b33c-8e8d-4907-8d74-c3809c6ebeda" containerID="d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f" exitCode=0 Oct 13 20:01:53 crc kubenswrapper[4974]: I1013 20:01:53.040087 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-9wdjx/must-gather-knsvx" event={"ID":"4c30b33c-8e8d-4907-8d74-c3809c6ebeda","Type":"ContainerDied","Data":"d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f"} Oct 13 20:01:53 crc kubenswrapper[4974]: I1013 20:01:53.041185 4974 scope.go:117] "RemoveContainer" containerID="d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f" Oct 13 20:01:53 crc kubenswrapper[4974]: I1013 20:01:53.143019 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9wdjx_must-gather-knsvx_4c30b33c-8e8d-4907-8d74-c3809c6ebeda/gather/0.log" Oct 13 20:02:05 crc kubenswrapper[4974]: I1013 20:02:05.202072 4974 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-9wdjx/must-gather-knsvx"] Oct 13 20:02:05 crc kubenswrapper[4974]: I1013 20:02:05.202839 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-9wdjx/must-gather-knsvx" podUID="4c30b33c-8e8d-4907-8d74-c3809c6ebeda" containerName="copy" containerID="cri-o://20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832" gracePeriod=2 Oct 13 20:02:05 crc kubenswrapper[4974]: I1013 20:02:05.215470 4974 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-9wdjx/must-gather-knsvx"] Oct 13 20:02:05 crc kubenswrapper[4974]: I1013 20:02:05.712920 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9wdjx_must-gather-knsvx_4c30b33c-8e8d-4907-8d74-c3809c6ebeda/copy/0.log" Oct 13 20:02:05 crc 
kubenswrapper[4974]: I1013 20:02:05.713497 4974 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-9wdjx/must-gather-knsvx" Oct 13 20:02:05 crc kubenswrapper[4974]: I1013 20:02:05.899298 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmbhh\" (UniqueName: \"kubernetes.io/projected/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-kube-api-access-jmbhh\") pod \"4c30b33c-8e8d-4907-8d74-c3809c6ebeda\" (UID: \"4c30b33c-8e8d-4907-8d74-c3809c6ebeda\") " Oct 13 20:02:05 crc kubenswrapper[4974]: I1013 20:02:05.899524 4974 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-must-gather-output\") pod \"4c30b33c-8e8d-4907-8d74-c3809c6ebeda\" (UID: \"4c30b33c-8e8d-4907-8d74-c3809c6ebeda\") " Oct 13 20:02:05 crc kubenswrapper[4974]: I1013 20:02:05.908528 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-kube-api-access-jmbhh" (OuterVolumeSpecName: "kube-api-access-jmbhh") pod "4c30b33c-8e8d-4907-8d74-c3809c6ebeda" (UID: "4c30b33c-8e8d-4907-8d74-c3809c6ebeda"). InnerVolumeSpecName "kube-api-access-jmbhh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.002462 4974 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmbhh\" (UniqueName: \"kubernetes.io/projected/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-kube-api-access-jmbhh\") on node \"crc\" DevicePath \"\"" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.105847 4974 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "4c30b33c-8e8d-4907-8d74-c3809c6ebeda" (UID: "4c30b33c-8e8d-4907-8d74-c3809c6ebeda"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.180900 4974 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-9wdjx_must-gather-knsvx_4c30b33c-8e8d-4907-8d74-c3809c6ebeda/copy/0.log" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.181277 4974 generic.go:334] "Generic (PLEG): container finished" podID="4c30b33c-8e8d-4907-8d74-c3809c6ebeda" containerID="20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832" exitCode=143 Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.181321 4974 scope.go:117] "RemoveContainer" containerID="20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.181422 4974 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-9wdjx/must-gather-knsvx" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.206548 4974 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/4c30b33c-8e8d-4907-8d74-c3809c6ebeda-must-gather-output\") on node \"crc\" DevicePath \"\"" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.222205 4974 scope.go:117] "RemoveContainer" containerID="d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.315843 4974 scope.go:117] "RemoveContainer" containerID="20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832" Oct 13 20:02:06 crc kubenswrapper[4974]: E1013 20:02:06.316466 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832\": container with ID starting with 20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832 not found: ID does not exist" containerID="20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.316523 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832"} err="failed to get container status \"20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832\": rpc error: code = NotFound desc = could not find container \"20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832\": container with ID starting with 20bc1ea13b3d12d01a68eb843955f930aa491ea4ca1fc1bc6c1ad1c80b36a832 not found: ID does not exist" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.316559 4974 scope.go:117] "RemoveContainer" containerID="d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f" Oct 13 20:02:06 crc kubenswrapper[4974]: E1013 
20:02:06.317036 4974 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f\": container with ID starting with d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f not found: ID does not exist" containerID="d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f" Oct 13 20:02:06 crc kubenswrapper[4974]: I1013 20:02:06.317087 4974 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f"} err="failed to get container status \"d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f\": rpc error: code = NotFound desc = could not find container \"d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f\": container with ID starting with d946cda6e0f47c1a2e505bb3cefcb4cb710d96c55c7d81d740f4a865b113e44f not found: ID does not exist" Oct 13 20:02:07 crc kubenswrapper[4974]: I1013 20:02:07.743597 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 20:02:07 crc kubenswrapper[4974]: I1013 20:02:07.745257 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 20:02:07 crc kubenswrapper[4974]: I1013 20:02:07.829371 4974 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c30b33c-8e8d-4907-8d74-c3809c6ebeda" 
path="/var/lib/kubelet/pods/4c30b33c-8e8d-4907-8d74-c3809c6ebeda/volumes" Oct 13 20:02:37 crc kubenswrapper[4974]: I1013 20:02:37.742796 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 20:02:37 crc kubenswrapper[4974]: I1013 20:02:37.743521 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 20:02:52 crc kubenswrapper[4974]: I1013 20:02:52.428119 4974 scope.go:117] "RemoveContainer" containerID="a468f0852551094ac26b775e3582efbc8a57446b0f244c7c1a04d5c4c9c813c0" Oct 13 20:03:07 crc kubenswrapper[4974]: I1013 20:03:07.743438 4974 patch_prober.go:28] interesting pod/machine-config-daemon-xpb6b container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Oct 13 20:03:07 crc kubenswrapper[4974]: I1013 20:03:07.745985 4974 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Oct 13 20:03:07 crc kubenswrapper[4974]: I1013 20:03:07.746276 4974 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" Oct 13 20:03:07 crc kubenswrapper[4974]: 
I1013 20:03:07.747980 4974 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fc2b28bf2a1af49a262232e7c9a21172aeb3f7c941e6129ac71d99b29609819a"} pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Oct 13 20:03:07 crc kubenswrapper[4974]: I1013 20:03:07.748350 4974 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" podUID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerName="machine-config-daemon" containerID="cri-o://fc2b28bf2a1af49a262232e7c9a21172aeb3f7c941e6129ac71d99b29609819a" gracePeriod=600 Oct 13 20:03:07 crc kubenswrapper[4974]: I1013 20:03:07.947322 4974 generic.go:334] "Generic (PLEG): container finished" podID="013d968f-6cef-476b-a6fc-88d396bd5cd1" containerID="fc2b28bf2a1af49a262232e7c9a21172aeb3f7c941e6129ac71d99b29609819a" exitCode=0 Oct 13 20:03:07 crc kubenswrapper[4974]: I1013 20:03:07.947371 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerDied","Data":"fc2b28bf2a1af49a262232e7c9a21172aeb3f7c941e6129ac71d99b29609819a"} Oct 13 20:03:07 crc kubenswrapper[4974]: I1013 20:03:07.947404 4974 scope.go:117] "RemoveContainer" containerID="5733e8afb77126a77b18c88af55d24463ec876074e191edc47193a4142b55f40" Oct 13 20:03:08 crc kubenswrapper[4974]: I1013 20:03:08.960501 4974 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-xpb6b" event={"ID":"013d968f-6cef-476b-a6fc-88d396bd5cd1","Type":"ContainerStarted","Data":"485767c4fa1dfad4dc78fb5f325bb84d40c0195a8ca410a73fb72df69b4f28d5"} Oct 13 20:03:52 crc kubenswrapper[4974]: I1013 20:03:52.561961 4974 scope.go:117] "RemoveContainer" 
containerID="75a4e09b24560b88017a1595b8d03e492f9425ed5c0c8b56f8fe32f3ef1b4102" Oct 13 20:03:52 crc kubenswrapper[4974]: I1013 20:03:52.609076 4974 scope.go:117] "RemoveContainer" containerID="0c42ee3ab294d92bc36f3f61176dc56791b918b3898a70fd2a7ed3131fa4868b"